My Philosophy of Web Design

Having spent years trying to figure out what makes a commercial Web site successful, I've come to the conclusion that it can all be summed up in two words: Simple sells. This principle applies to both the user experience (what I call "human factors") and the site's internal coding and structure ("machine factors").
Human Factors
When designing a site for a local business, it's important to keep in mind that people shopping for goods and services aren't looking to be entertained. They're looking for information. Anything on a site that gets in the way of their finding that information (or that is otherwise unrelated to the company's business) is a distraction.
When shopping for goods and services on the Internet, people want:
- pages that load quickly
- a clean, relaxed, uncluttered style that doesn't assault the senses
- intuitive, easy-to-follow navigation links
- informative, concise, clearly written content
- freedom from excessive moving images, sounds, pop-ups, or other distractions
Sites that are designed with these human factors in mind are more likely to convert visits to sales, to be bookmarked, and to be visited again in the future when the visitor has similar purchase intentions.
I also believe that a Web site should complement and enhance a client's existing marketing strategy, not redefine or distract from it. To the extent possible, I try to integrate the company's present color scheme, logos, truck wraps, or other designs into their Web site to help them build their brand and enhance their public recognition.
Machine Factors
Web sites should be designed and written for humans, but a smart Web designer also keeps the needs of machines in mind when developing a site. I call this "keeping the robots happy."
Sites whose code is efficient and conforms to Web standards perform more predictably than do sites that rely excessively on scripting technologies or "novel" coding techniques. This is especially true nowadays because so many users' computers are infected with spyware that may interfere with proper script execution. Ad blockers can also interfere with JavaScript execution.
Simple, standards-compliant coding is also much easier for search engine robots to crawl. I've taken over a few sites whose visual designs and content were perfectly fine, but whose code was so convoluted that search engine robots couldn't follow the links. As a result, these otherwise well-designed sites barely showed up — if at all — in search engine results. Their pages were invisible because search engine robots couldn't find them.
Avoiding excessive reliance on client-side scripting techniques is especially important when writing navigation menus. Some designers write elaborate JavaScript menus that may look very pretty, but deviate so far from standards that search engine robots simply can't follow them. That's why I like to code my navigation menus using pure, standards-compliant CSS.
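To illustrate, here is a minimal sketch of the kind of menu I mean: ordinary HTML links that any robot can follow, with a submenu revealed on hover using nothing but CSS. The class names and URLs are illustrative, not taken from any particular site.

```html
<!-- Plain, crawlable HTML links; no JavaScript required -->
<nav class="site-nav">
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services">Services</a>
      <ul>
        <li><a href="/services/repair">Repair</a></li>
        <li><a href="/services/installation">Installation</a></li>
      </ul>
    </li>
    <li><a href="/contact">Contact</a></li>
  </ul>
</nav>

<style>
  .site-nav ul { list-style: none; margin: 0; padding: 0; }
  .site-nav > ul > li { display: inline-block; position: relative; }
  /* Submenu hidden by default, shown on hover -- pure CSS, no scripting */
  .site-nav li ul { display: none; position: absolute; left: 0; }
  .site-nav li:hover > ul { display: block; }
</style>
```

Because the links are ordinary anchors in the markup, a search engine robot sees every destination even though the submenu is visually hidden until a visitor hovers over it.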
In short, properly coded sites that follow conventions and contain well-written content stand a better chance of getting good search placement, as well as getting potential customers to call your company. There are times when client-side scripting is the best or the only way to do something (for example, the theme switcher on this site uses JavaScript), but its use should be limited to those situations.
Please contact me for more information about how I can help you to improve your online presence and grow your business.