A Web Design Paradox: Humans vs. Robots
WebiMax Contributor, February 25, 2010
We've all heard that content is the most important part of a website. It has even been crowned "King". We also know that if a website is difficult to find, it basically does not exist.
Hard To Find = Does Not Exist.
This is why Search Engine Optimization is so critical to your online marketing.
While Humans ultimately buy the products and services offered on the web, those products and services would rarely be purchased if the robots didn't find and index them first. So, do we design websites for the humans or the robots?
Option 1 - Design for Humans:
It's been proven through various forms of web usability testing and research that web users generally don't read our pages... they scan them. In "Don't Make Me Think: A Common Sense Approach to Web Usability", author Steve Krug sums it up nicely:
When we're creating sites, we act as though people are going to pore over each page, reading our finely crafted text, figuring out how we've organized things, and weighing their options before deciding which links to click.
What they actually do most of the time (if we're lucky) is glance at each page, scan some of the text, and click on the first link that catches their interest or vaguely resembles the thing they're looking for. There are usually large parts of the page that they don't even look at.
We're thinking "great literature" (or at least "product brochure"), while the user's reality is much closer to "billboard going by at 60 miles an hour."
There are a number of proven methods to entice humans to engage with our message. The three main guidelines for writing for the web include the following:
- Be Succinct. Write no more than 50% of the amount of text used in print publications.
- Write for Scannability. Don't require users to read through dense copy, which on the web sounds like Charlie Brown's school teacher... "Whah, whah, whah, whah, whah, whah". Instead, use short paragraphs, subheadings, and bulleted lists.
- Hire Professionals! Good content requires a dedicated staff that knows how to write for the web and how to massage your content into your website design layout for optimal read... I mean... scannability.
Option 2 - Design for Robots:
In order to successfully get content in front of card-carrying humans (a.k.a. potential customers), websites need to be structured so they are easily indexed by search engine robots (a.k.a. crawlers, spiders). Search Engine Optimization depends largely on keywords and key phrases, so writing keyword-rich copy is absolutely critical to increasing search engine rankings.
There are a number of proven methods to optimize web content for search engines. The three main guidelines include the following:
- Generate Keyword-Rich Copy. Content needs to work well at delivering your message to your Human visitors, while making your targeted keywords and key phrases easy for Robots to index. Use your keyword phrases in headlines, title tags, the first paragraph, the top of the HTML document, and in alternative text on images. But be careful not to overdo it - you don't want to appear to be keyword stuffing.
- Develop Accessible Markup. Accessibility is not just for the visually impaired. The more accessible your HTML pages are, the easier it is for search engines to read and rank them.
- Create a Detailed Site Map. Submitting an XML Sitemap file to the search engines helps them understand how to crawl and index all of your pages, including how frequently the content changes.
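To make the first two guidelines concrete, here is a minimal sketch of a keyword-optimized, accessible page. The business, the phrase "custom birdhouses", and all file names are hypothetical - substitute your own targeted key phrase and content:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Key phrase appears in the title tag, near the top of the document -->
  <title>Custom Birdhouses - Handcrafted Cedar Birdhouses</title>
  <meta name="description"
        content="Handcrafted custom birdhouses built from cedar, made to order.">
</head>
<body>
  <!-- Key phrase in the main headline, using a real heading element -->
  <h1>Custom Birdhouses, Built to Order</h1>

  <!-- Key phrase in the first paragraph, written for human readers first -->
  <p>Our custom birdhouses are handcrafted from cedar and shipped
     ready to hang.</p>

  <!-- Descriptive alt text: helps screen readers and gives robots
       indexable text for the image -->
  <img src="cedar-birdhouse.jpg"
       alt="Handcrafted cedar custom birdhouse with copper roof">
</body>
</html>
```

Note that the accessibility guideline and the keyword guideline reinforce each other here: semantic headings and alt text serve visually impaired visitors and search engine robots with the same markup.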
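And for the third guideline, an XML Sitemap follows the sitemaps.org protocol. This is a two-page sketch with placeholder URLs - list every page of your actual site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-02-25</lastmod>
    <!-- Hints to robots about how often this content changes -->
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <lastmod>2010-02-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once the file is created, submit it through the search engines' webmaster tools so the robots know where to find it.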
While writing for robots is essential to SEO, don't stress mechanical search engine optimization so much that users' needs are forgotten. We must provide content on our websites in a format that supports the way both Humans & Robots use the web. We must write for human visitors first, and then optimize our code and content to help search engine robots find and index our pages.