Every day, businesses spend large amounts of time and money on SEO campaigns only to find that their sites are not budging in the rankings. Most site owners and SEO agencies put the bulk of their emphasis on building a good link profile and conducting keyword research. While both are very important to the success of an SEO campaign, many developers and site owners overlook aspects of a site’s structure and backend that can actually hurt its search rankings. With recent algorithm changes (Google’s Panda and Penguin updates), search engines are increasingly able to crawl sites the same way people view them.

Page Speed

Page speed is first and foremost a usability concern. While Google may not weight your site’s speed heavily in its rankings, it is still worth making your site as fast as possible: the faster your pages load, the less time it takes Googlebot to crawl them. From a conversion standpoint, page speed matters even more. A user is far more inclined to keep browsing a page that loads quickly and works properly; slow-loading pages cost you visitors.

Heavily coded pages load slowly. There are a few tips and tricks you can implement to speed up your site and make the experience much friendlier for your users:

  • External CSS: Your site’s CSS should live in a separate file on your server, not inline on every page. Consolidating all of your CSS into one external style sheet reduces HTTP requests, letting each page load faster (see the example after this list).
  • Caching and Compression: Browser caching allows the browser to load previously downloaded material from the user’s machine rather than over the network, and text compression compacts resources sent over the network, reducing download time. Two of the most popular compression methods are GZIP and Deflate. How you enable text compression and leverage browser caching will depend on your hosting and CMS (a sample Apache configuration follows this list). Where available, these tools will dramatically increase your site’s speed.
  • Optimized Images: Reducing the file size of your images will dramatically increase the speed of your site’s pages. The trick is reducing file size without losing too much image quality. There are many free and paid tools available for image compression and optimization.
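
For example, a single external style sheet referenced from the <head> of every page (the file name and path here are just for illustration):

<head>
  <link rel="stylesheet" type="text/css" href="/css/styles.css">
</head>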
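
And as a rough sketch of caching and compression, on an Apache server with mod_deflate and mod_expires enabled, an .htaccess file along these lines turns on both; the exact directives and cache lifetimes will vary with your hosting and CMS:

# Compress text-based resources before they are sent over the network
AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript

# Let browsers cache static files instead of re-downloading them on every visit
ExpiresActive On
ExpiresByType text/css "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"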

Indexation and Crawling

Insufficient crawling and incomplete indexation are a major downfall for many sites. It is important to ask yourself, at every stage of the development process, whether your site can be crawled properly by search engines. There are ways to ensure that Google, or any search engine, will be able to see your site properly and crawl it thoroughly:

  • XML Sitemap: An XML sitemap helps search engines find all of the pages of your site that you want crawled. It should include every page you want indexed and should be updated often. For larger sites, it is common practice to break the pages up into several smaller sitemaps. The sitemap should be stored in the root directory of your site, and a reference to it should be included in your robots.txt file as well (see below):

Sitemap: http://www.mysitedomain.com/sitemap.xml
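
The sitemap file itself is plain XML following the sitemaps.org protocol. A minimal example with a single URL looks like the following (the date in <lastmod> is a placeholder, and the tag is optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysitedomain.com/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
</urlset>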

  • Canonicalization: Canonical URLs ensure that your link equity isn’t distributed among different variations of the same content. For instance, if your site’s homepage can be accessed at www.yoursite.com, yoursite.com, and www.yoursite.com/index.html, a user sees no difference between them. To search engines, however, these are three completely different pages, and you lose page authority because the link equity is split among the three URLs instead of consolidated on the one canonicalized version. Correct canonicalization means choosing one URL structure and using it in all internal and inbound links site-wide (see the sketch after this list).
  • Robots.txt: This file is stored in the root directory of your site’s server and tells search engines whether certain files and/or directories should be crawled. Depending on your setup, there are particular server directories that should not allow crawling (a sample robots.txt follows this list).
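
As a sketch of how to enforce one canonical version, assuming the www version is the one you want: a 301 redirect (shown here with Apache’s mod_rewrite) consolidates the domain variations:

# .htaccess: permanently redirect yoursite.com to www.yoursite.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]

And a rel="canonical" link in each page’s <head> covers variations a redirect can’t, such as /index.html:

<link rel="canonical" href="http://www.yoursite.com/">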
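
A basic robots.txt might look like the following; the blocked directories are only examples, since which ones to disallow depends entirely on your setup:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Sitemap: http://www.mysitedomain.com/sitemap.xml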

Page Structure and Hierarchy

One mistake that is made quite often is the misuse of HTML heading tags on a site’s pages. Heading tags (<h1>, <h2>, <h3>) mark the headlines that tell search engines what a page is actually about. The H1 of every page should be as descriptive of, and relevant to, the page’s content as possible. Sub-headings within the content under the H1 should be tagged as H2s, sub-headings within the content of the H2s should be tagged as H3s, and so forth.

Heading tags help search engines determine whether your content is relevant to a search query. They also break your pages into sections, making it much easier for robots to crawl and digest your content.
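
As a minimal sketch (the headline text is invented for illustration), a correctly structured page follows that outline:

<h1>Blue Widgets: A Buyer's Guide</h1>
<p>Introductory content about blue widgets...</p>

<h2>Choosing the Right Size</h2>
<p>Content on sizing...</p>

<h3>Measuring Your Space</h3>
<p>Content on measuring...</p>

<h2>Caring for Your Widgets</h2>
<p>Content on care and maintenance...</p>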

The most common misapplications of heading tags are:

  • Using heading tags for blocks of text (they are only for headings, and should be short and descriptive)
  • Using the H1 tag multiple times per page (it should only be used once per page, as the main heading for that page)
  • Hiding the H1 tag or placing the logo inside the H1 tag

The examples above are just a handful of the practices site owners should follow when developing with SEO in mind. You can avoid complications with search rankings, and penalties from Google’s Panda and Penguin algorithm updates, by taking all aspects of your site’s structure into account.