Jerry Smith's CIS 625 Web Design Journal

Resources for Making Your Website Work For You

Welcome

This site provides abstracts of articles as well as links to various web design resources that will aid IT managers in making web design decisions. The site was created to fulfill a requirement for Web Information Systems and Internet Technologies, a graduate class taken at Morehead State University. Each week, newly abstracted articles will appear on this homepage. Previously abstracted articles can be found in the Abstract Archive. As part of the learning process, the site has been created with Microsoft Expression Web 2.0.

Most Recent Abstracted Articles
Week 6: April 13, 2009 through April 19, 2009

Password Attack Discussion & Benchmarks

Alan Amesbury of the University of Minnesota's Office of Information Technology provides an excellent write-up on passwords, explaining how the number of possible characters, the length of the password, and the hashing algorithm all affect how long it takes to crack a password:

  • The easiest way to make a password harder to crack, whatever the hashing algorithm, is to increase the number of possible characters it may contain. To illustrate this, Amesbury holds the length constant at 7 characters and compares a case-insensitive password (26 possible characters) with a case-sensitive one (52 possible characters). The case-sensitive password allows 2^7 = 128 times as many combinations, roughly two orders of magnitude more!
  • The best password policy involves all letters (case-sensitive), all digits, all symbols (i.e. shift+digits), and a space, for 69 characters in total. A 7-character password drawn from those 69 characters provides over 7 trillion combinations.
  • Increasing the password length by just one character (all else held constant) raises the number of combinations to over 513 trillion. The sketch after this list works through these figures.
  • A good hashing algorithm is deliberately CPU-intensive: hashing one password is no burden, but hashing the enormous number of candidates required for a brute-force attack becomes too slow to be practical.
  • Using a 3.2 GHz Xeon, Amesbury found that it would take 95 years to brute force every possible password of 1 to 8 characters drawn from 69 possible characters when hashed with Microsoft's NTLM. FreeBSD's MD5-based hash, given the same parameters, would take more than 11,000 years!
  • Given the data he presents, Amesbury suggests that passwords be of variable length, at least 6 characters long, contain mixed-case letters and at least one symbol, and include nothing that can be found in a dictionary. Dictionary attacks can succeed against even FreeBSD's hash in under 15 days.
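The arithmetic behind these figures is easy to check. The short Python sketch below reproduces the combination counts quoted above; the character-set sizes (26, 52, and 69) come from the article, while the script itself is only an illustration.

```python
def combinations(charset_size, length):
    """Number of possible fixed-length passwords over a character set."""
    return charset_size ** length

# Character-set sizes taken from Amesbury's write-up; lengths as discussed above.
print(combinations(26, 7))   # lower-case only, 7 chars: ~8.0 billion
print(combinations(52, 7))   # mixed case, 7 chars: ~1.03 trillion (128x more)
print(combinations(69, 7))   # full 69-character set, 7 chars: over 7 trillion
print(combinations(69, 8))   # one extra character: over 513 trillion

# Variable-length passwords of 1-8 characters over the 69-character set,
# the space covered by the 95-year NTLM estimate: roughly 521 trillion.
print(sum(combinations(69, n) for n in range(1, 9)))
```

Note the jump from about 7 trillion to over 500 trillion from a single extra character; length matters as much as the size of the character set.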

Biggest Mistakes in Web Design 1995-2015

Everyone's favorite cranky web design consultant, Vincent Flanders, has compiled a list of the most common mistakes that bad webmasters make. His best suggestions are:

  • Your site is only important to potential surfers if it does something useful for them.
  • If visitors to your site can't figure out the purpose of your site within four seconds, the site is not doing its job.
  • Design should not get in the way of the purpose of the site. Even if it is pretty, if a design keeps visitors from what they came to the site for, scrap it.
  • Don't put too much stuff on one page, and certainly don't put too many different types of stuff on a page.
  • Don't think your visitors are going to care too much about web standards. While adhering to standards is good, your visitors only stick around if the site is useful to them.
  • Be careful with the use of images, Flash, and JavaScript. Only use these elements if they add actual benefits to users.
  • There is nothing wrong with making your site look and behave like other successful sites.  Being totally different with navigation or design such that your site looks nothing like any other site will probably confuse many of your visitors.

Practical Tips for Government Web Sites (And Everyone Else!) To Improve Their Findability in Search

Vanessa Fox at O'Reilly Radar believes that government websites will only be useful if their contents can be easily found with search engines. She says these recommendations are vital for government sites but important for non-government sites as well:

  • Sites should create well-formed XML sitemaps (see the first sketch after this list). If the site is structured well enough that a sitemap can be created to explain its contents, then its contents are most likely logically organized.
  • While sitemaps do not give site owners total control over what search engines do, they help steer the crawlers toward what is most important.
  • Make sure any public content is accessible without requiring registration or specific user input. Search engines (and oftentimes users) will abandon their quest for information if some sort of input gets in their way.
  • If file names or locations change, make sure to serve a 301 Moved Permanently redirect from the old URL to the new one (a minimal example follows this list).
  • Make sure to include ALT text with images.
  • Give each page a unique title, and make sure the title actually describes the page.
  • Make sure pages remain functional and informative even when JavaScript and images are turned off or unavailable. This helps search engines, as well as users who lack these features, reach the important information.
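To make the sitemap recommendation concrete, the Python sketch below writes a minimal sitemaps.org-style sitemap.xml. The page URLs and dates are invented for the example; a real site would generate the list from its content database or file system.

```python
import xml.etree.ElementTree as ET

# Hypothetical list of public pages; a real site would generate this
# from its content database or directory structure.
PAGES = [
    ("https://www.example.gov/", "2009-04-13"),
    ("https://www.example.gov/services/", "2009-04-10"),
    ("https://www.example.gov/contact/", "2009-03-30"),
]

def build_sitemap(pages):
    """Return a well-formed <urlset> element per the sitemaps.org protocol."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return urlset

if __name__ == "__main__":
    ET.ElementTree(build_sitemap(PAGES)).write(
        "sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The resulting file can then be referenced from robots.txt or submitted through the search engines' webmaster tools.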
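Serving the 301 redirect mentioned above is usually a one-line web-server configuration, but the idea is simple enough to show in plain Python. The sketch below, using only the standard library, answers requests for a hypothetical old path with a 301 Moved Permanently response pointing at its new location; the path mapping is made up for illustration.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of old paths to their new locations.
MOVED = {
    "/reports/2008/budget.html": "/publications/budget-2008.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # A 301 tells search engines to replace the old URL in their
            # index with the new one instead of dropping the page.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
        else:
            self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()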


Abstract Archive List