To the technically uninitiated, it can be hard to tell whether a site needs SEO work. What follows is a general SEO litmus test you can perform on your site, using only a browser. No fancy tools, litmus paper, or black magic required.
The process covers three primary areas:
Proper indexing – ensures the correct pages on your site are appearing in Google’s results, and that their listings are displayed optimally.
Architecture – primarily concerned with creating a logical, linear path through your site, ensuring deep pages are crawled.
Performance testing – ensures the site’s speed is up to par, by reviewing site code and server settings.
Search engine result pages (SERPs) are the primary battleground in SEO, and proper display of your site can make or break a campaign. Below is a standard example of a SERP result.
Each web page on the internet contains two distinct areas, the <head> and the <body>. The head holds information that is invisible to the user on the page unless it is requested. Some of this information is called metadata, which tells search engines important things about your page. It surfaces as the title bar text in browsers, as well as in various areas of the SERPs. Proper use of these and other website resources gives your pages the best possible representation in search results.
Proper Indexation, or How I Learned to Ignore my Siamese Twin
Each SERP listing represents one URL on any given site. Only one URL, and therefore listing, should correlate with each page on your site. One important SEO concept to understand is the difference between a page and a URL. A URL is an address to a particular location on the web. This “address” can sometimes be replicated in various ways, which causes two URLs to exist that reach the same page.
For example, an ecommerce site may have the same product in different categories, which creates two URLs that can serve an identical product page.
Google indexes URLs, so even though the page is the same, each URL is seen as a duplicate listing. Splitting indexation not only dilutes the “credit” (or PageRank) each listing receives, but can also trigger devaluation for duplicate content, which Google treats as a quality problem.
Copy content from whatever URL you want to rank for a particular page, and search for it in Google, limiting the search to the site.
site:yoursite.com "content, enough to differentiate from other pages"
If your site has limited amounts of content, search for the title of the page instead:
site:yoursite.com intitle:"page title"
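If you have many pages to spot-check, the two query forms above can be assembled programmatically. A minimal Python sketch (the domain and phrases are placeholders you would replace with your own):

```python
def site_query(domain, phrase=None, title=None):
    """Assemble a Google site: query for a quick indexation check."""
    parts = [f"site:{domain}"]
    if phrase:
        parts.append(f'"{phrase}"')         # exact-match content snippet
    if title:
        parts.append(f'intitle:"{title}"')  # fall back to the page title
    return " ".join(parts)

print(site_query("yoursite.com", phrase="content, enough to differentiate"))
print(site_query("yoursite.com", title="page title"))
```

Paste each printed query into Google and count the results, exactly as described above.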
Possible Results and Issues:
I didn’t find any results, what now?
Enter the URL alone with the site: operator, as shown:

site:yoursite.com/page-url
If no results appear with this test, an actual block to indexation may exist on the site. If this does occur, test every level of your site to see if the problem is widespread. The root cause should be explored further by an SEO professional.
I found multiple copies, what now?
There are multiple ways to block off duplicate content, but an SEO professional should help you identify the best method. Attempting to de-index parts of your site without proper advice can lead to unintentional de-indexation of large areas, or the entirety, of the site.
I found one result, what now?
If you found one unique result, examine the listing. For reasons we will cover in the next section, we recommend a title tag that reflects site structure, and does not exceed four levels hierarchically.
Such a page might have a title like:
Widgi Blue Pro Kit | Self Reciprocating Widgets | Widgets.com
This title implies the site structure to Google, as well as the user.
All meta descriptions should include a call to action and an alternate contact method, usually a phone number. Additionally, if a meta description does not contain the query that brought up the result, Google may replace it with on-page content surrounding that term. Similarly, titles can be rewritten by Google if they are too long or not descriptive. An SEO professional can help minimize this by crafting a keyword strategy that targets the appropriate queries on the page. Using the above example, a meta description might look like:
Since 2003, Widgets.com has been the premier widget provider in Eastern Canada. Explore our expansive online catalog of premium widgets, or call us today at 555-555-5555.
Ensure that page titles are no longer than 70 characters, and that descriptions do not exceed 160 characters.
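A quick way to audit these limits in bulk is a small script. A Python sketch using the 70/160 limits above (the sample title is the one from the earlier example; the short description is a placeholder):

```python
TITLE_MAX, DESC_MAX = 70, 160  # limits recommended above

def check_snippet(title, description):
    """Return a list of warnings for SERP snippets likely to be truncated."""
    issues = []
    if len(title) > TITLE_MAX:
        issues.append(f"title is {len(title)} chars (max {TITLE_MAX})")
    if len(description) > DESC_MAX:
        issues.append(f"description is {len(description)} chars (max {DESC_MAX})")
    return issues

title = "Widgi Blue Pro Kit | Self Reciprocating Widgets | Widgets.com"
print(check_snippet(title, "Premium widgets since 2003. Call 555-555-5555."))
# [] -- both fields fit within the limits
```

Run it against every title and description on the site; anything it flags is a candidate for Google rewriting your snippet.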
Website Architecture, and other Pyramid Schemes
Assuming that all of the pages on the site are singularly indexable, and properly display their meta data in the SERPs, site architecture must be examined. Site architecture refers to the paths through which the crawler will navigate the site in question.
Establishing linear linking paths is very important to ensure that the whole site is crawled appropriately. In addition to ensuring indexation of the whole site, Google uses the paths it crawls, and the URLs it sees along them, to determine the hierarchy of site pages. Sites often use redundant navigation in the form of sidebars, blogrolls, or large footer sections, all of which can create a web-like network that confuses Googlebot.
Chaotic linking design that provides too many pathways around the site, in a non-linear progression, burns through what is known as crawl budget. Crawl budget is an estimate the crawler makes of how much time it should spend on a site. Entities known as “crawl traps” can exhaust this budget by offering an infinite number of navigation choices from the page in question. Ecommerce filtering, which changes the URL based on the order in which filters are clicked, is a prime example. When a crawler burns through its budget, or becomes stuck, it may abandon the site without exploring deeper pages.
The example above shows a linear progression of tiers established by using intelligent navigation design. From the home page, navigation directs to category pages, which in turn have unique menus that direct to the deepest pages on the site. Site depth should be kept, ideally, to three steps below the domain. Deeper pages run a risk of not being crawled regularly, or at all. Interlinking can occur between pages within the same group, or silo, but should not cross into other silos.
Check each tier of the site against the Google SERPs, using the site: operator (site:yoursite.com). If a page fails to appear, check a sampling from the same tier. If they are also not indexed, it can indicate the bot is not diving deep enough.
Possible Results and Issues:
A particular tier cannot be found in the index, what now?
Barring indexation problems, site architecture should be examined for a few key issues.
In most browsers, right-clicking a link brings up a context menu similar to the one below:
If the menu that appears instead shows Flash options, the link is a Flash link, which crawlers cannot follow.
Clicking “Inspect element” on a particular link will bring up the code behind it.
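To audit links in bulk rather than one at a time, the same check can be sketched with Python's standard-library HTML parser. The sample markup is hypothetical; the point is that a plain <a href> can be followed, while a javascript: handler or a Flash embed cannot:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect hrefs a crawler can follow; flag non-crawlable patterns."""
    def __init__(self):
        super().__init__()
        self.crawlable, self.suspect = [], []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            href = attrs.get("href", "")
            if href and not href.startswith(("javascript:", "#")):
                self.crawlable.append(href)   # a real, followable path
            else:
                self.suspect.append(href or "(no href)")
        elif tag in ("embed", "object"):      # Flash-era navigation
            self.suspect.append(attrs.get("src", tag))

audit = LinkAudit()
audit.feed('<a href="/widgets">Widgets</a>'
           '<a href="javascript:nav(2)">Page 2</a>'
           '<embed src="menu.swf">')
print(audit.crawlable)  # ['/widgets']
print(audit.suspect)    # ['javascript:nav(2)', 'menu.swf']
```

Feeding each tier's page source through a check like this surfaces navigation a crawler will silently skip.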
Beyond pathway issues, crawl traps may be to blame as well. Check to see whether filtered pages, contact forms, or other complex mechanisms are indexable. They may have to be blocked off, with alternate pathways put into place. To do this, use the site: operator from earlier to see whether a page is indexed.
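The filter-ordering trap described above is easy to demonstrate in miniature. Below is a Python sketch (the path and parameter names are made up) showing how click order multiplies URLs for a single page, and how canonicalizing parameter order collapses them back to one:

```python
from itertools import permutations

# Three hypothetical filters a shopper might click in any order.
filters = ["brand=widgi", "color=blue", "size=xl"]

# Every click order produces a distinct URL: a crawl trap in miniature.
raw_urls = {"/widgets?" + "&".join(p) for p in permutations(filters)}

# Sorting the parameters canonicalizes all orderings to a single URL.
canonical = {"/widgets?" + "&".join(sorted(p)) for p in permutations(filters)}

print(len(raw_urls))   # 6 distinct URLs for the same page
print(len(canonical))  # 1
```

With three filters the crawler already sees six URLs for one page; with ten, it would see over three and a half million.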
Bring a Jet to the Drag Race
Google’s ostensible goal is to provide the best possible model of user behavior online. It should come as no surprise that site speed, which has a tremendous impact on user experience, is also factored into Google’s algorithm.
One of the quickest gauges of site performance is Google PageSpeed Insights.
The interface will prompt for a domain, then provide a score and a list of optimization suggestions. Fixing the issues listed as high or medium priority will have a dramatic impact on your score.
On the sidebar there is a link listed for “Critical Path Explorer”. This displays a timeline of page load, broken down by element. This can help you visualize the amount of time it takes for a page to load. Generally speaking, a second or two is ideal for most sites. Large ecommerce may take longer.
If this post helps you recognize any red flags, feel free to reach out to Webimax. Our team of SEO professionals has the support of a massive in-house team of inbound marketers, writers, developers, and designers.
Google’s ostensible goal is to create the best corollary to how users browse the web. Without impediment, this would allow Google to eventually provide near-perfect information to the user. Whether by commission or naiveté, the SERPs are not free of the static noise that is spam. As such, any methods Google perceives as disruptive to this goal will eventually be devalued or penalized, in an effort to clean up the results.
Traditional SEO has become a balancing act of effectively responding to increasingly aggressive updates from Google. Agile response to these updates is crucial, as many techniques are still effective, even if they don’t necessarily gel with Google’s mission statement. This agility must be tempered with acceptance of Google’s process, and embracing of more “bulletproof” methods.
The more natural the method, the higher its inherent power. The more control a webmaster has over a method, the more potential it has for exploitation. Onsite optimization is a far weaker signal than a citation: citations tell the engine that others find the site relevant, whereas optimization ensures proper indexation and definition of site hierarchy and content. Optimization is still important, but it represents a more technical attempt to garner traffic through manipulation of rank.
Humanization will be the new optimization. Content must be crafted for people, not regurgitated banal brochure copy crammed with alchemically-devised mixtures of anchor text and keyword densities. “Linkbuilding” is dwindling in deference to inbound marketing, relationship building, and outreach to like-minded individuals. Social interaction isn’t icing on the cake anymore, it is a necessary avenue from which sites must garner traffic.
(Blueberry / Cherry)
Content Strategy vs. Page Optimization
Onsite work has always been divided into two camps, content and speed. Speed is purely technical, and will always remain a constant. This is not to say that new technical standards may not benefit the way in which sites are indexed, cataloged, or presented, but it is a scientifically objective avenue.
While a certain amount of attention needs to be paid to metadata, headings, and content when attempting to rank for particular keywords, this only establishes a definition of your page’s content for the SERPs. Content needs to provide something other than repeated keywords.
Google scrutinizes duplicate, thin, or “overoptimized content”. Content should be natural, unique, and of an appropriate length for its purpose. If content is on a page purely for SEO, the existence of that page should be questioned.
(Guess The Smell! Answers at the End of the Post!)
All of the items below add value to a site’s content, beyond simple information. Some are harder to implement than others, but the return is much higher. Often, content can be generated by the user given the correct platform.
• Any informational site can craft evergreen content. Create tutorials for how to use your product, craft case studies for on-site contractor work, show people the value of your product over another.
• Gamification has become increasingly popular. Sites offer onsite rewards, certifications, or levels of achievement that may, or may not, have any bearing on the outside world, but inspire the user to continually engage the site.
• Often users are attracted to the opinion of others on current events, politics, or whether Star Wars is going to become a Jar-Jar fest with Disney at the helm. People crave interaction online. Sometimes even just a strong, opinionated, personality can draw people.
• Staying on top of news, regardless of topic, can pull users in. If you’re the first to report on certain topics, especially in niche spheres, it can be a massive boost to your perceived authority, garnering interaction and citation.
• I admit, humor could have fit in with other sections, but very few things bring as much repeat traffic. If you can get on someone’s funny bone, you’ve got a repeat visitor for life. Just don’t let them down.
• User generated content is a powerful thing that can quickly grow out of control, in good and bad ways. Many of the most popular sites on the net are fueled purely by user generated content. Consistent curation and censorship is necessary, but the returns can be limitless.
• Rich media powerfully augments any standard content. Video especially can aid in explanation, entertainment, or humanization of a conversation. With the constant decrease in cost and increase in quality of video equipment, it is becoming increasingly easy to create professional rich media.
• Social interaction does not have to be relegated to offsite resources; start discussions on your site. Bring on guests to spark continual interest. Host offsite events. Introduce users to other users. A constantly growing community is one of the most powerful tools a site can wield.
So now you’ve got powerful, engaging content. How do you get people to see it?
Inbound Marketing vs. External Linkbuilding
Linkbuilding has been one of the most abused areas of SEO for a very long time. Often, inbound linking resources were constructed with low quality, duplicate, or non-existent content. Because anchor text matched, they would still imply relevance to Google. Legitimate properties have existed for just as long, but were easily outmatched by overzealous submission to article directories, social bookmarking sites, and other artificially malleable sources.
Since the last wave of Google updates, including the dreaded Penguin, these practices have been largely devalued, leaving many sites with the proverbial rug pulled out from under them. Additionally, when Google senses significantly artificial link velocity, it penalizes sites, crippling their ranking power for the offending anchor text.
(Scratch Penguin’s belly! It smells like sadness!)
So how does one imply relevance without bulk-loading one’s diet with artificial domains? Content crafting has shifted its emphasis from optimization to humanization, and linking follows in lock-step. While quality content that fulfills a consumer need can be enough to garner an impressive link portfolio, it is not the only avenue.
Social Integration vs. Community Leadership
Pure organics should be supported by effort-driven social engagement. Every site has the potential to become part of a community, industry or otherwise. Local meetups, charities, or other personal interest groups can connect people in ways business relationships can’t.
Social interaction used to be the icing on the cake, a sidecar to powerful SERP rankings. With the decreasing effectiveness of artificial SERP posturing, it has become increasingly important to develop a decent social profile. The more a profile interacts and adds to the universal conversation, the more authority it confers on the rest of your sphere.
There are two important aspects to any social campaign. Cognizance of social discussions, trends, news, and events is paramount. This shows that you are a true player upon the social stage. Tempering this is consistency of message, voice, and brand. This inspires user trust in the stability of your brand, as well as allowing them to adopt your site into their social routine. Basing social actions on these two avenues helps grow your authority.
Once you are established as a social player, a few things will happen:
• You will likely garner more organic links to your properties, as people discuss your interactions.
• You can help lead the industry discussion, often to your benefit.
• You may be asked on other properties to weigh in on industry topics.
• Industry players may be willing to appear on your properties, weighing in on industry topics
Tracking Success in a Post-Penguin, Post-Panda World
Since its inception, the SEO industry has been hung up on ranking and SERP traffic. While organic traffic will always be important, and a reflection of ranking performance, one must not discount other metrics. First, just because the SERPs are sending traffic through the site does not mean that traffic is converting. Social traffic may have higher conversion potential, even if it is volumetrically weaker. Once relationships with other sites are established, direct channels from those referrers may also garner a significantly higher conversion percentage.
To compound the issue, in most cases a conversion is a monetary transaction, but this is not always the case.
• Information Gathering – A conversion may be a contact form, lead capture, or crowdsourcing.
• Content Generation – An upload of user-generated material.
• Social Interaction – The goal of the site may be to generate online or offline event participation.
• Internal Operations – It may be a recruiting tool for company careers, or an education tool for current employees.
Because of the wide range of possibilities, it is imperative to properly define your site’s purpose and what a conversion is.
Humanization is the Future, Welcome to the Atomic Age of SEO
What makes the internet so powerful is its ability to bring people together, regardless of location. The shift from optimization to humanization is occurring, ushering in a new age for SEO and the internet. Unique and effective content, outreach, and community engagement will all become integral in the success of any campaign.
Feel Free To Embed Our Atomic Infographic!
(Lime, Banana, Blue Raspberry, Cherry, Grape, Lemon-Lime, Orange, Shame)
While not all websites need to engage SEO at a local level, many businesses rely wholly on the surrounding community, and the associated physical and electronic traffic. In a technological environment where mobile usage is skyrocketing, empowered by innumerable GPS-driven applications, proper geographic listing, and engagement, is essential.
On Page Markup
There are a number of ways to mark up site pages to further define attributes to Google, such as geographic location, ratings, and reviews. SEO is largely about defining your site more fully to aid Google’s total view of your properties, and local SEO is no different. Failure to define certain aspects leaves Google to make an educated guess.
Microdata is a series of on page definitions that can be leveraged to imply further geographic relevance to Google, by eliminating the ambiguity machines encounter when attempting to define a page.
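As a rough illustration, the snippet below assembles a schema.org LocalBusiness block with microdata attributes. The business details are invented for the example; the itemscope/itemtype/itemprop attributes and type names are real schema.org vocabulary:

```python
# Hypothetical business details; replace with your own.
business = {"name": "Widgets.com", "street": "123 Widget Way",
            "city": "Moncton", "region": "NB", "phone": "555-555-5555"}

# Microdata wraps visible page content in machine-readable attributes.
markup = f"""
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">{business['name']}</span>
  <div itemprop="address" itemscope
       itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">{business['street']}</span>,
    <span itemprop="addressLocality">{business['city']}</span>,
    <span itemprop="addressRegion">{business['region']}</span>
  </div>
  <span itemprop="telephone">{business['phone']}</span>
</div>
"""
print(markup)
```

The visible text is unchanged for users, but the attributes remove any ambiguity about who and where the business is.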
Take the searches to the right. If you type in karma, the number one result is about the religious concept.
Philadelphia, a geographic keyword, prompts Google to search for geographically relevant results, which are defined through a variety of means, including microdata.
In a similar way to geographic data, one can define areas of a page for Google to display in the SERPs. These range from raw ratings to the caloric count of recipes.
Much of local search is conducted from mobile devices, and mobile users are often looking for spur-of-the-moment business reviews. Better SERP visibility could make the difference between obtaining a new customer or not.
Local pages should be optimized with an official Google places map, as well as directions from local landmarks, shown to the left. This requires a full and complete Google Plus Local profile. Multiple locations should have separate pages for each one.
Cross Site Local Signal Optimization
To cement the location of your business in Google’s mind, a consistent phone number and address should appear in the footer of every page on the site. The footer address should reflect either the main location, or the relevant sub-location if pages are divided by location.
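Consistency is easy to lose across dozens of pages and listings. A minimal Python sketch (the listing data is invented) of the kind of normalization check involved: differing formats should still resolve to one phone number and one address.

```python
import re

def normalize_phone(s):
    """Strip everything but digits so differing formats compare equal."""
    return re.sub(r"\D", "", s)

def nap_consistent(listings):
    """True when every listing resolves to one phone number and one address."""
    phones = {normalize_phone(l["phone"]) for l in listings}
    addresses = {" ".join(l["address"].lower().split()) for l in listings}
    return len(phones) == 1 and len(addresses) == 1

listings = [
    {"phone": "(555) 555-5555", "address": "123 Widget Way, Moncton"},
    {"phone": "555.555.5555",  "address": "123 Widget Way,  Moncton"},
]
print(nap_consistent(listings))  # True: formats differ, the data agrees
```

Run the same comparison across your site footer and every external directory listing; any mismatch it finds is a signal Google may be seeing two different businesses.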
On-site optimization is strengthened by complete and thoughtful listings on external directory sites. Primarily, sites should be listed properly on Google, Bing, and Yahoo business pages. Secondary listings on review and geographic social sites, such as Yelp, Foursquare, or Urbanspoon, should be engaged once all of the primary listings are in place. Traditional listing resources are becoming more and more web-oriented; engage your local papers, radio, or TV.
High Quality Directories
Listing sites generally have high domain authority, appearing high in Google’s search engine result pages. This means that as your site climbs the SERPs, you may already have an established presence at the top waiting for you. They are also terrific resources for engaging user comments, whether positive or negative. Each review helps you craft future engagements with customers.
Business Citation Management
Citation management sites, such as Localeze and Yext, allow for single-point bulk optimization and distribution of consistent directory listings, utilizing rich media when possible. Built in analytic and feedback processes allow for mass corrections based on user feedback. This type of centralized control allows for accurate, up to date listings that can be changed in bulk, on the fly.
Outreach to Influential Yelp Users
Local decisions are very often driven by customer feedback, rather than sheer proximity. As such, it has become increasingly important to have customers post reviews of your business.
Outreach to high power users, in relevant geographic areas, who have reviewed similar products can provide quality feedback that can sway other user’s opinions. Yelp, the preeminent king of review sites, offers the perfect platform for outreach.
First, one can differentiate between registered users and the desired “power” users. Any user’s Yelp profile displays their friend and rating counts to ensure outreach to an active, adequately followed user. There is also an “elite” badge that appears on power users.
Even if a user has a decent number of friends and reviews, their rating distribution should also be scrutinized for an even, natural spread. Readers are often turned off by a high number of 5-star reviews, because they can appear unnatural.
Outreach should involve commenting on the user’s reviews and establishing a relationship prior to requesting a review.
Google Plus Optimization (Formerly Places)
Google Plus Local has taken the local search torch from Google Places. Proper optimization of this property is of extreme importance, as it dictates how Google displays the locations in local search.
Complete, accurate, and verified information is necessary to list in the local search results, which often appear after the first 2 SERP results.
Additionally, mobile users using GPS may use the listing to find the location. Inaccurate information may mislead these individuals.
Become a Community Player
Knowledge of the local industry environment is key. Engage your community in the real world. Events are great sources of actionable social media material, both before and after. In addition to the community, engage your industry, and become a leader in your field, even if it’s just socially.
As beneficial as the optimization and outreach can be online, nothing beats person to person contact. There’s a level of trust and sincerity that one can imply socially that cannot be met by any degree of SEO.
Become the community leader. If you can’t, build your own community, or lead in what you can. Social, technical, business, there’s a niche for everyone.
Attend meetings, network, connect others, build your presence, and engage others online.
The local search “game” is about defining your site to Google as completely as possible. Whenever a new social or listing property rolls out, claim your local presence, and consider how you can use it to either further define your geography or customer base to Google.