It was all too easy. You didn’t need to sign in … you didn’t need an AdWords account … and you didn’t get bid suggestions when you didn’t want them. It was an easy, ultra-credible source of search traffic statistics directly from the source – Google.
Replaced by the Keyword Planner, the Google Keyword Tool is no more. So we have to move on without it. We have to find new ways to find the best keywords for our Internet marketing efforts. And, in my opinion, there’s no better way to do so than going directly to the source once again – Google.
Google Keyword Planner
Yes, although the new Keyword Planner is slightly more complex and geared toward people interested in starting online ad campaigns, it still serves a purpose for those uninterested in ads.
We can still get all the monthly traffic and competition data we received with the Keyword Tool. The main difference between the old system and the new Keyword Planner is that, with the latter, your results page is muddied with bid suggestions. Also, more steps are needed to get to the information you want.
But you still get the information – the most dependable variety – from Google. Without a doubt, this is information worth taking a few extra steps to get.
When using the Keyword Planner, here are some things to keep in mind:
- You’re given four options instead of one on the main Keyword Planner screen. To get results similar to those produced by the Keyword Tool, choose option 2 – “Get search volume for a list of Keywords…”
- As you did with the Keyword Tool, use [brackets] for exact match and “quotations” for phrase match. Broad match is standard.
- Use the first option to get keyword ideas, as you automatically did with the Keyword Tool.
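The match-type notation above is easy to get wrong, so here is a tiny illustrative helper that spells out one keyword in all three forms. The function name is mine, not part of any Google tool:

```python
def format_match_types(keyword):
    """Return the three AdWords match-type spellings of a keyword:
    broad match is the bare term, "phrase match" is quoted, and
    [exact match] is bracketed."""
    return {
        "broad": keyword,
        "phrase": '"{}"'.format(keyword),
        "exact": "[{}]".format(keyword),
    }

print(format_match_types("walnut shell blasting"))
```

Pasting the bracketed or quoted forms into the Keyword Planner restricts the stats to exact or phrase match, just as the old Keyword Tool did.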
As you can see below with Keyword Planner, with keyword ideas absent on the main screen, the display of information is less obtrusive. The only information we don’t care for is located in the last 3 columns – suggested bid, impression share and “Add to plan” options. To break it down simply, the retired Keyword Tool was more consolidated and the new Keyword Planner is more broken up.
Note: You must create a Google AdWords account before you can begin using the Keyword Planner.
Keyword Eye
Created in the UK, Keyword Eye embraces a fun, minimalist approach to keyword research and is the easiest keyword tool I’ve used. It has also been 100% reliable for me. Powered by SEMrush, it delivers keyword search volume, PPC competition and relevant keyword ideas. Run a few tests and you’ll see that it produces information nearly identical to the Keyword Planner’s.
According to the website and confirmed by me, the free version includes:
- 10 keyword searches per day
- 10, 50 or 100 keyword suggestions per search
- 10 Google country keyword databases
On that last point: since Keyword Eye is based in the UK, its search results default to UK data. If you’re running reports in the U.S., all you have to do is switch the results to “Google US” before you begin your search.
Being limited to one country per search is the tradeoff you make for the simplicity and freedom of Keyword Eye. That, and, of course, the cap of 10 searches per day. But if you need to run a country-specific report quickly and efficiently, Keyword Eye is what you want.
Gone like a home run – not into the abyss.
Recently, I came across two great articles by Neil Patel, co-founder of KISSmetrics, and Brian Gardner, founder of Studiopress. They both had one word in common – LONG. They also confirmed what I already believed to be true: 1) long form content dominates search rankings and 2) long tail keywords promote higher quality traffic.
Long Form Content: Brilliant When Necessary
When Neil Patel says long form content converts more than short form content, he’s talking about high quality web content. He’s talking about a page that powerfully expounds on one specific point – not a page that’s unfocused and comes across muddled. Remember, even though Google is a machine, it’s a damn smart one.
More Quality Content = More Social Signals = Higher Rankings
Google is smarter than ever because it now reads social signals. That means the more tweets, likes, +1s and other social shares that your page has, the more authority it receives in search engine rankings.
And guess what receives the most social shares? Long form content.
In Patel’s article about content length, he uses his own famous blog, Quick Sprout, to test word count’s effect on social metrics. To do this, he took the 327 posts he’d written for the site and separated them into two categories: 1) blog posts under 1500 words and 2) blog posts over 1500 words. He then took the average number of tweets and Facebook likes received in each category and made a handy graph.
After crunching the numbers, Patel concluded that his posts over 1500 words received 68.1% more tweets and 22.6% more Facebook likes than his posts under 1500 words. This is just one small example, but it’s consistent with others I’ve come across during my time as a content writer at WebiMax.
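The percentages themselves are simple percent-increase arithmetic. A quick sketch shows how a figure like 68.1% falls out of two category averages; the averages below are hypothetical placeholders, since the raw per-post numbers aren’t reproduced here:

```python
def percent_increase(short_avg, long_avg):
    """Percent by which the long-form average exceeds the short-form average."""
    return (long_avg - short_avg) / short_avg * 100

# Hypothetical category averages, for illustration only
tweets_short, tweets_long = 100.0, 168.1

print(round(percent_increase(tweets_short, tweets_long), 1))  # 68.1
```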
Think about this: Google gives high quality long form content an advantage over high quality short form content published on the same day (assuming that each hosting website has similar authority). Because long form content ranks higher, more people are bound to look at it — and because the quality of the content is high, more people are likely to share it. This means higher rankings.
More evidence that long form is on the rise: Google’s recent launch of in-depth articles.
Long Tail Keywords: It’s as Simple as Adding “What Is”
If you use any keyword tool, you’ll see that shorter terms have more competition and longer terms have less. Because the tools show that WAY more users are searching for the shorter terms, people are often tempted to try to rank for these.
Unless your website has superior domain power, however, it could take years – even decades – to rank on page 1 for a short, generic term.
You read it right – decades.
For this reason, SEO companies and web whizzes like Brian Gardner are targeting long tail keywords – keywords three words or longer. In Gardner’s article about long tail keywords, he confirms something that I discovered during my time working for a local BMW performance shop in Manayunk, Philadelphia: adding something as simple as “what is” to a popular term can have amazing results.
My own experience: As a marketing assistant at the performance shop, I developed the company’s content marketing strategy using old school SEO tactics. I would write articles and post them on every article directory I came across: Ezine, Sooper Articles, Article Snatch and others (a testament to my SEO ignorance at the time).
One day, I wrote a post on walnut shell blasting – a practice used for cleaning the intake valves of vehicles. Before writing it, I looked for a keyword using Google’s old AdWords Keyword Tool. “Walnut Shell Blasting” had high competition, while “What is Walnut Shell Blasting” had very low competition.
Long story short, I added the “what is.” Now you can find my Ezine article about walnut shell blasting at #1 on Bing. I imagine that if I had posted the piece on the company’s blog instead of on multiple article directories, it would have been close to #1 on Google, too. However, as you probably know, Google has very strict duplicate content rules.
Gardner’s experience: A while back, Gardner wrote a post on email marketing – its definition, how people use it, etc. Like me, before writing it, he did some research and found that he had a better chance ranking if he added “what is” before “email marketing.” As he expected, Google rewarded him with highly targeted traffic.
When Gardner wrote his article on long tail keywords, he noted that “what is email marketing” ranked #14 on his keyword referrals list for Google Search. Pretty impressive.
According to Gardner, “the majority of searches performed are of the long tail search variety. Rather than typing in a generic word or two and sifting through pages of results to find what they’re looking for, searchers are much more likely to type in longer phrases to immediately find the specific information they need.”
Evidence that the use of long tail keywords is growing: SEO companies like WebiMax are focusing on long tail keywords’ enormous potential for highly targeted traffic to increase rankings for new and existing clients.
Imagine the online recognition that could be achieved by combining long form content with long tail keywords.
Vast like the abyss. Awesome like a home run.
Choosing keywords for your business’s website can be tricky. But it’s something you have to do if you want people to find your business on the Web. I always like to take some time to learn about the site and understand what it will take for someone to get there. It’s good to put yourself in the searcher’s shoes and think about all the different words they may type into a search engine to get where they want to be. In this week’s blog post, I’d like to briefly discuss some of the different aspects of keyword research.
Broad and Exact Match Volume
Before I explain why one should stay away from keywords with low or no search volume, I should explain exactly what the search volume means.
Search volume is expressed as a number: the average number of searches for a term in a month. Most people focus on two types of search volume: broad and exact. I prefer exact.
Broad volume counts every search involving any mixture of the keyword’s words, along with other words not accounted for, which means it usually overestimates demand. Exact volume, on the other hand, is a much simpler number and should normally be one of the main factors when selecting the right keywords: it is the average number of searches for that specific keyword, exactly how it’s spelled, with no other words added.
If a term shows no or extremely low volume, then it is pointless to use the term. That means no one is searching for it and if no one is searching for it, no one will be getting to the site using that term.
Of course, some businesses only want to compete for keywords within their own geographic location. A bagel shop in Philadelphia, for example, doesn’t want to compete for visibility with everyone who’s going after the term “bagel shop.” They just want to target people looking for bagel shops in Philadelphia.
Adding a location to a keyword ties that term to a specific area. In the example of the bagel shop, it would be in their best interest to add “Philadelphia” to the term “bagels.” Furthermore, an individual may be looking for a place to buy bagels in the area and could use “Philadelphia area” or even the slang term “Philly.” Optimizing a site for all these different localized terms would benefit the business. Obviously, if a business is not local, it would be wise not to add localization to the keywords in mind.
There are a few key factors to consider when determining keywords. I like to look at the exact volume, broad volume, difficulty score, current Google rank and current Bing rank.
I get my keyword difficulty score from SEOmoz’s Keyword Analysis tool. This number indicates how challenging it is to rank for a keyword. The score is calculated by analyzing the domain and page authority of the top 20 results in Google for the entered term. I like to choose terms that score 50% or below; these are moderately competitive. Typically, if the search volume is high for a specific keyword and the site already ranks for it, you can choose a term that exceeds a difficulty score of 50%.
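SEOmoz’s actual formula is proprietary, but the basic idea – scoring a term against the authority of the pages it would have to outrank – can be sketched like this. The function and the authority values are hypothetical, not the real calculation:

```python
def difficulty_score(top_result_authorities):
    """A toy difficulty score: the mean authority (0-100) of the pages
    currently ranking for the term. The real SEOmoz calculation is
    proprietary; this only illustrates the concept."""
    return sum(top_result_authorities) / len(top_result_authorities)

# Hypothetical authority values for the top 10 results of a term
serp_authorities = [62, 55, 48, 44, 41, 38, 35, 33, 30, 28]
score = difficulty_score(serp_authorities)
print(score, score <= 50)  # moderately competitive if 50 or below
```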
The other two determining factors are Google and Bing rank. These are pretty self-explanatory: they are the positions the keyword currently holds for its corresponding URL. If the site already ranks for a specific term, there is a chance of moving it up in the results.
Like I said earlier, the first thing I do when choosing keywords is get a feel for the site: what type of site it is, what its content is about, what it sells, how it sells it, what the site wants to get out of the campaign and so on. Knowing the top services or products sold is always a plus. It also helps to understand the target location (local, national or global).
Once I have all this information gathered, I start writing down keyword possibilities. After I have a few terms, I like to check the site’s positions through SEMrush to get more ideas. There you can see the organic research report and most of the terms the site is already ranking for (not always all of them, but still very helpful). The tool can also show competitors and the terms they rank for. I then add the terms I feel are relevant to my ideas list. Once I have a few ideas, I take them to the Google AdWords Keyword Tool.
In the AdWords Keyword Tool, I type in all my keyword ideas. Normally I set the location to United States, but that can change depending on the target area. I check off Exact and Broad volumes and then search. Most times, I use the “Keyword ideas” tab and not the “Ad group ideas” tab. From there I save all of my searched ideas and then scroll down for other ideas that Google suggests. These suggestions can help uncover new terms because they are closely related to the original searches. Once I mark off all the terms I feel fit, I download and export them to an Excel sheet where I gather all my data.
After ruling out all the terms with no or almost no search volume, I find each keyword’s difficulty score. Once I have all the difficulty scores, I rule out the high ones – preferably anything above 50, though 55 is often my limit. It is important to choose terms with a high exact search volume because that is proof users are searching for this exact term across the Web, and that is what’s needed most. Check all terms to determine their positions and whether or not they are ranking in Google and Bing; this can help determine whether a high-difficulty term still has a good chance of ranking.
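The filtering steps above – drop no-volume terms, drop high-difficulty terms, prefer high exact volume – can be sketched in a few lines of Python. The field names, thresholds and example numbers are assumptions for illustration:

```python
def shortlist_keywords(rows, max_difficulty=55, min_exact_volume=10):
    """Filter candidate keywords: drop terms with (almost) no exact
    search volume, drop terms whose difficulty score is too high,
    then sort the survivors by exact volume, highest first."""
    keep = [
        r for r in rows
        if r["exact_volume"] >= min_exact_volume
        and r["difficulty"] <= max_difficulty
    ]
    return sorted(keep, key=lambda r: r["exact_volume"], reverse=True)

# Hypothetical data in the spirit of the bagel-shop example
candidates = [
    {"term": "bagels", "exact_volume": 49500, "difficulty": 72},
    {"term": "philadelphia bagels", "exact_volume": 880, "difficulty": 41},
    {"term": "philly bagel shop", "exact_volume": 170, "difficulty": 35},
    {"term": "bagel emporium philadelphia pa", "exact_volume": 0, "difficulty": 12},
]
for row in shortlist_keywords(candidates):
    print(row["term"])
```

The generic head term falls out on difficulty, the zero-volume term falls out on volume, and the localized terms survive in volume order.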
Once I have all these determining factors, I line them up nice and neat in a spreadsheet, using filters to sort the different metrics. It is important to use good judgment when selecting terms by comparing all the statistics that have been compiled. Most importantly, keep an eye out so that no good-opportunity terms get missed.
All in all, once the finalized keywords are selected, the rest is history. Optimize the site as well as you can by using these keywords in on-site content and links, and even in off-site blogs and links. Use them in title tags, meta descriptions and most on-page elements, but avoid stuffing. After all, that’s the point of all this research, right?
For the last twenty years, numerous SEO companies and internet advertisers have depended on keywords as a guiding light for search engine indexers and site crawlers. A tactic used by ethical and unethical online marketing agencies alike, heavy keyword implementation was so pervasive throughout the web development community that almost everyone came to rely on it. Of course, this all started to change with the arrival of Google’s Panda updates and the recent release of Penguin. Now, webmasters are looking for ways to remain relevant to Google and other search engines while revising their own operations.
Smart Keyword Use: Only When Necessary
As Google made clear in its original announcement of Penguin, high-quality content is at the top of the company’s wish list for its SERPs. The implications of this demand for engaging webpages are many, but in this case we’ll focus on the greatly reduced effect of what is known as “keyword stuffing.” This practice describes the rather unscrupulous behavior of repeatedly using key phrases and terms in order to game a search engine and artificially strengthen a page’s apparent relevance. In the past, too many marketing agencies would stuff their clients’ online properties with keywords, but these days search engines have become smart enough to know the difference between spam and good content.
As a result of this, everyone needs to get on the same page (pun not intended) as Google and emphasize the importance of interesting and unique content over questionable optimization methods. Although the world’s biggest search engine still uses keywords to categorize and archive pages, the repetition of a key term throughout a page means that Google’s search algorithm now regards it as having a low value. As a result, business owners and webmasters should use focused keywords only as needed.
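Google has never published a density threshold, but a crude check conveys the kind of signal involved. The helper below and the 3% rule of thumb are assumptions for illustration, not anything Google has confirmed:

```python
def keyword_density(text, phrase):
    """Rough keyword density: words belonging to occurrences of the
    phrase, divided by the total word count of the text."""
    words = text.lower().split()
    occurrences = text.lower().count(phrase.lower())
    return occurrences * len(phrase.split()) / len(words)

stuffed = "car engines car engines buy car engines best car engines cheap"
natural = "classic automobile engines differ sharply from modern racing designs"

print(keyword_density(stuffed, "car engines") > 0.03)   # True: reads as stuffed
print(keyword_density(natural, "car engines") > 0.03)   # False
```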
Keyword Limitations Lead to Quality Content
While being forced to use a keyword conservatively may sound like a hassle, the fact is that it yields a number of benefits. For one thing, putting a limit on keyword use leads to content that is fresher and more interesting to read. Content writers should also use the situation to explore more varied topics. For example, a keyword such as “car engines” may be the focus of a page, but that doesn’t mean the content needs to be all about that term. Instead, writers can focus on the way engines work in classic automobiles versus top-of-the-line racing cars, or on other related topics.
Even though the new obstacles set forth by the Google Penguin and Panda updates may be a thorn in some SEO developers’ sides, it’s really just another way to motivate website and blog owners to create content that users will read and maybe even share. For further advice regarding how to use keywords in this post-Penguin world, I can be contacted at firstname.lastname@example.org.
Tomorrow marks one month since the debut of Google’s Penguin, and everyone is still trying to make sense of the update. While a couple of websites have come into existence in response to Google’s most recent revisions to its algorithm, little progress has been made toward actually puzzling out how Penguin works. Although it is highly unlikely that the company will ever reveal the mechanisms behind the update, the SEO community has at least come up with a few useful tips for creating content that Google’s search engine wants.
Don’t Repeat Content
Regardless of whether a website is reposting text from elsewhere or pulling materials directly from other online properties, businesses should take care never to duplicate or steal content. Aside from the obvious moral implications inherent in this sort of black hat SEO activity, content reposting is one of the many optimization behaviors that Penguin explicitly punishes. Every day, Google crawls millions of websites and compares what it finds against existing pages. Should content be discovered to have been taken from elsewhere, that page’s SERP rankings are dropped as a result.
In order to avoid this issue, webmasters and business owners need to remain consistent in their efforts to create unique content for their pages. Since Google’s intended goal is to create a network through which users can find useful or interesting sites quickly and effortlessly, 100% original work is far more likely to show up in the company’s SERPs than copy-pasted material. As such, every website owner should create their pages from scratch or from personal templates when generating content.
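Google hasn’t disclosed how its duplicate detection works, but shingle-based similarity is a classic document-comparison technique and conveys the idea. The function below is my own sketch, not Google’s method:

```python
def jaccard_shingles(a, b, k=3):
    """Jaccard similarity over k-word shingles: the fraction of
    k-word sequences the two texts share. 1.0 means identical
    shingle sets; 0.0 means no overlap at all."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}
    sa, sb = shingles(a), shingles(b)
    if not (sa | sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "walnut shell blasting cleans the intake valves of direct injection engines"
copied = "walnut shell blasting cleans the intake valves of many modern engines"
print(round(jaccard_shingles(original, copied), 2))  # 0.5
```

A lightly reworded copy still shares most of its shingles with the original, which is why paraphrasing a page rarely makes it “unique.”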
Only Write When It’s Worthwhile to the Reader
Mismanaged SEO campaigns often end up creating mountains of blogs, social media posts and webpages that lack any true substance. In an ongoing mission to secure popular keywords and gain additional indexing opportunities, many people blunder in their SEO efforts and simply generate new content that holds some SEO value but offers nothing of worth to potential readers or site visitors. Although this sort of content may initially get picked up by site crawlers and show up on search results, a lack of user engagement means that it will only end up falling down the SERPs rather quickly.
In order to create webpages and blogs that retain decent rankings and respectable traffic, businesses need to provide users with content that is worth interacting with or sharing with others. Well-written and informative pages tend to be the same ones that appear to users in their initial search results most often. While it may be tempting for a company to arbitrarily create new pages and blogs for link-building purposes, business owners should make certain that any new content made for them is interesting and engaging.
For additional information about creating quality content post-Penguin, I can be contacted at email@example.com.
A few days ago, Google unveiled its newest search feature, Knowledge Graph, to network users. For those readers who are unfamiliar with the announced program, Knowledge Graph is being launched as a sidebar addition to the company’s search engine results. While the company’s SERPs will stay the way they’ve always been, the new feature will serve up interesting facts, details and relevant information for popular keywords entered in queries. Google has stated that it has plans to bring the function to mobile platforms in the future as well.
Although Knowledge Graph has yet to become available to all Google users, numerous SEO companies already have their own stance on how the feature may affect the current state of online marketing. In recent months, Google has been responsible for some of the biggest and most impactful changes in internet advertising. As a result, everyone in the SEO community is keeping a watchful eye on the company and will be for some time to come. While marketing agencies will let their clients know about important news, business owners should still stay aware of these latest happenings in order to actively improve their web presence.
What Should SMB Owners Take Away from Knowledge Graph?
Aside from the supplemental nature of Knowledge Graph, Google’s newest network feature also gives us some insight into the sort of trending page elements that the company regards as highly important. Between the information-based focus of Knowledge Graph and the strength of Wikipedia in the company’s SERPs, one can see that Google wants more informative sites these days. Yet while this realization is made readily apparent through Google’s recent efforts, not everyone is taking advantage of it.
Creating Quality that Search Engines Want
Many of the WebiMax blog readers are small or startup business owners who are looking to get their online properties well-represented on every engine’s search results. Although Google’s ranking trends are not entirely indicative of what other search engines are looking for these days, the company does tend to set the pace for what is seen on most SERPs. If anything, it’s a safe bet that the same sort of informative content that Google’s search algorithm finds desirable will rank well on competing engines.
In order to get better traction in the SERPs, more businesses need to work on creating content for their online properties that is not only informative to readers but also interesting. While not every page of a company’s website may have space for this type of content, a business should always devote some time to creating it where it can. Oftentimes, company blogs and user-maintained pages act as hubs for news and information that readers will find engaging. Other venues for this type of content may include employee sites that focus on related topics and link back to the aforementioned blog.
While it’s still uncertain where Knowledge Graph will eventually lie in Google’s overall business plan, there are still several useful conclusions that can be drawn from the new network feature. Should readers have any particular questions, I can be contacted at firstname.lastname@example.org.