At the start of every football season, the first few games often come as a surprise in terms of how well or how poorly some teams play. One team yesterday, picked by many to be a potential Super Bowl participant, barely eked out a victory over a rival that isn't expected to fare quite as well. Other teams won by wide margins over teams expected to be much more competitive.
A football team goes through a lot from the end of one season to the beginning of the next, from drafting young players, to signing or losing free agents, and sometimes even through coaching changes, adoption of new strategies and approaches, personnel changes in front offices, and more. There are always teams that emerge out of nowhere to win more than expected, and other teams that don't live up to pre-season hype.
There are a lot of eyeballs on those teams and their players, from the press to the front office, from fans to fantasy team owners, and from professional scouts to amateurs who prognosticate in forums. There are many ways that someone can judge the talent on a team, and its chances of winning, including draft day evaluations, unofficial scouting reports, and reporters' head-to-head evaluations.
This stage of the football season reminds me of how we often evaluate websites and how well they might rank in search results. Rankings are often based upon a combination of an information retrieval (IR) score, involving how relevant a page might be for a particular query, and an importance score such as PageRank. Other factors come into play as well. For instance, if a query includes the name of an entity, and a search engine has associated a particular website with that entity, that site might rank well for the query even if it isn't the page or site with the highest combination of IR score and importance score. Because of the association, it might even be listed multiple times at the top of a set of search results.
There are other methods of ranking and re-ranking search results that may cause pages to rank above where we might expect based upon IR score and importance score. For example, Google will sometimes include localized organic results in rankings as a way to make those results more relevant for people living in particular areas. So, on a search for "hospital," for instance, one or more of the top ten results you see might be for a local hospital, even though it doesn't have the highest IR and importance scores.
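To make the idea concrete, here's a toy sketch of that kind of two-part scoring followed by a localized re-rank. This is purely illustrative and not Google's actual algorithm; the weights, the boost value, and the field names are all invented for the example.

```python
# A toy illustration (not any search engine's real formula) of blending a
# query-dependent relevance (IR) score with a query-independent importance
# score, then re-ranking with a localization boost. All numbers are made up.

def combined_score(ir_score, importance_score, ir_weight=0.7):
    """Blend relevance and importance into one ranking score (weights invented)."""
    return ir_weight * ir_score + (1 - ir_weight) * importance_score

def rank(pages, user_region=None, local_boost=0.4):
    """Score each page, boost pages local to the searcher, sort descending."""
    scored = []
    for page in pages:
        score = combined_score(page["ir"], page["importance"])
        if user_region and page.get("region") == user_region:
            score += local_boost  # a localized organic result gets a lift
        scored.append((score, page["url"]))
    scored.sort(reverse=True)
    return [url for _, url in scored]

pages = [
    {"url": "bighospital.example", "ir": 0.9, "importance": 0.9},
    {"url": "localhospital.example", "ir": 0.6, "importance": 0.5,
     "region": "springfield"},
]

print(rank(pages))                             # raw scores favor the big site
print(rank(pages, user_region="springfield"))  # the boost can flip the order
```

The point of the sketch is simply that a page with lower raw IR and importance scores can still surface near the top once a re-ranking signal, like location, is layered on.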
What can be even more challenging is that, regardless of those different re-ranking approaches, we don't know how much of a difference in scores there might be between a result showing up at the top for a query and the second result, or the third, and so on. We don't know if the top result is a potential Super Bowl contender, or just a little better than the result immediately below it. And we don't always know whether some special re-ranking factor is in play that might have raised it to the top.
When we work on a site to improve its quality, and make it more relevant for a particular query, we can't be certain how much of an improvement we might see by changing titles to make them more descriptive and more engaging. We don't necessarily know the impact of adding more quality content to a particular page that might be relevant to a specific term. We need to make those changes, and believe that by improving the quality of a page, we create the possibility that our efforts will result in more traffic, a higher ranking, and a better experience for visitors to a page. Often we need to be patient and wait to see what types of results such changes have.
A football team taking steps to improve from one season to the next works on its approach to drafting players, attempts to make smart choices in signing free agents, tries to hire great coaches, runs a smart training camp, and puts together a playbook that takes advantage of the strengths of its players and the weaknesses of opposing teams.
Likewise, when someone does SEO on a site, they try to meet the objectives of the site owner, understand the audiences for the site, and make it easier for the two to mutually benefit from being able to find one another. Smart SEO builds the foundation for that kind of engagement by making a site easy for search engines to crawl and index, by using on its pages the words that searchers interested in what the site owner offers actually use, and by providing a good user experience once those searchers arrive. The first step in any SEO campaign is preparing a site to be able to compete.
Pages I enjoyed this past week:
One of the really eye-opening articles that I saw last week echoes my feelings about what Google is building with Google Maps. The Atlantic's article, How Google Builds Its Maps—and What It Means for the Future of Everything, provides a look at Google Maps as both a challenge for Google and a tool with greater implications than just making it easier for us to find businesses or get driving directions. Perhaps the most telling aspect of what Google is trying to do with Google Maps comes in this statistic from the article:
In keeping with Google's more-data-is-better-data mantra, the maps team, largely driven by Street View, is publishing more imagery data every two weeks than Google possessed total in 2006.
While posting to Google Plus last week, I found a presentation that I really enjoyed, and I wanted to make sure that everyone saw an image I really liked from it. I copied the image and posted it as an image post, rather than a link post, in Google Plus. This way, the post showed a large image instead of a thumbnail. I included a link to the presentation in the body of the Google Plus post. Funny, but the Google Adsense blog suggested doing the same thing last week, in their post Social Fridays: Use images to deliver a richer experience.
That presentation is Crowdsourcing for Search Evaluation and Social-Algorithmic Search. It's a long one, but definitely worth spending some time with as it explores ways to crowdsource determining the relevance of search results. Given my intro above about evaluating talent and relevance, it's interesting to see this presentation by Matthew Lease of the University of Texas at Austin, and Omar Alonso from Bing, which explores alternative ways of evaluating search results. Below is the image from it that I liked so much:
Linking to your own site from a page or blog post can be a very relevant exercise, or it can look like you're trying too hard to gain relevant anchor text for your own pages. James Mathewson, from IBM, explains the approaches he uses in Three Types of Relevant Internal Links to Boost SEO.
Retro Post of the Week
This week's look back at pages and sites that I've found incredibly useful and helpful in the past features the Stanford Credibility Guidelines. While I've mentioned above how both football teams and search engines evaluate the talent they have or the relevance of pages in search results, it's also important to keep in mind how visitors to pages evaluate the businesses and offerings that they see.
The Stanford Persuasive Technology Lab put together these guidelines on how people might judge the credibility of a site almost a decade ago. I've been following them as much as possible since then, and they include the kinds of things that can really make a difference. You may want to add them to your playbook.