Follow These Steps to Remove a Google Manual Action for Unnatural Links Pointing to Your Site
Nowadays, you hear a lot about Google placing manual actions on websites due to unnatural links pointing to them. These manual actions can cause a site to suffer badly, and getting them removed can be a tedious process. One of the mistakes I often see is websites continuing to build links after they receive this penalty. At that point, all link building should stop and the main focus should be getting the penalty revoked. Please keep in mind that this process takes time; it cannot be done overnight or in a couple of days. The effort matters, because Google wants to see that you worked to recover from the penalty.
Step 1: Compile all of your external backlinks. You can download all of your links from Google Webmaster Tools (WMT), but I also recommend using additional sources to compile them; other tools I like are Ahrefs and Moz's Open Site Explorer. Just make sure you remove all duplicate links once you have a compiled list. A lot of the time, these backlink profiles will hold the same link twice: once with the www version and once with the non-www version. I recommend stripping "http://" and "http://www." with Excel's "Find and Replace" function. Removing these prefixes from the URLs and then deleting all duplicates will save you time when you analyze the links later.
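If the backlink export is large, the same normalization and de-duplication can be scripted rather than done by hand in Excel. A minimal sketch in Python; the URLs below are placeholders:

```python
def normalize(url):
    # Strip the scheme and "www." prefix so that http/https and
    # www/non-www variants of the same link collapse together.
    for prefix in ("http://www.", "https://www.", "http://", "https://"):
        if url.startswith(prefix):
            return url[len(prefix):]
    return url

links = [
    "http://www.example.com/page",
    "http://example.com/page",
    "https://spammy-site.net/links.html",
]

# Normalize every URL, then deduplicate while preserving order.
unique = list(dict.fromkeys(normalize(u) for u in links))
print(unique)
```

The two `example.com/page` variants collapse into one entry, which is exactly the duplicate problem described above.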
Step 2: Analyze the compiled list of links, highlighting all of the bad ones. This step can take some time, given how many links can make up an entire link profile, and each link needs to be thoroughly investigated (you cannot just spot-check a few). Keeping everything organized, work through the list from top to bottom and highlight every link that Google would deem unnatural. The best way to learn Google's definition of "unnatural" is to read the Link Schemes section of their quality guidelines.
Use this as a guide to determine which links are good and which are bad. In my experience, links that use exact-match anchor text are the most frequently targeted. A lot of the time, Google will go after a website's most-linked keyword and you will notice its rankings start to tank. The majority of links using those keywords as anchor text usually need to be removed.
Step 3: Make a strong effort to remove the bad links and document it. Google wants to see that you have dedicated a large amount of your time to removing the bad links. This is a manual process: e-mail the websites the bad links originate from and ask that they be removed. Use WhoIs.net to look up the e-mail address associated with the domain in question. Often the WhoIs record will be private, and you will need to search the site from top to bottom to find an e-mail address or a contact form.
Politely inform the webmaster of the situation that you are currently in, explain the location of the links and ask for their help in removing them. This entire process needs to be documented, showing the e-mail address associated with the bad link, the date the message was sent and the response (if any) that was received. This outreach should be repeated two or three times, documenting all efforts to get these links removed so that Google can see all the steps you took to resolve the issue.
Step 4: Disavow all the links that you were unable to remove. About a year ago, Google launched a great feature that allows you to disavow links to your site. Essentially, the disavow file makes those links invisible to Google, almost akin to turning them into nofollow links. Be careful, though: Google does not want you to abuse this feature, and it should only be used for links you have no control over and cannot get removed. The file must be submitted as a .txt file and must be correctly formatted. I prefer disavowing whole domains rather than individual links; most of the time the unnatural links live on spammy websites, and it is simply easier to disavow the entire domain. Google even allows you to leave notes in the disavow file explaining the situation behind each link or domain, such as a domain that requested payment for removal or one with no contact information. See the example below:
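A disavow file is plain text: lines beginning with "#" are notes, "domain:" lines disavow an entire domain, and a bare URL disavows a single page. The domains and dates below are placeholders for illustration:

```
# Contacted the webmaster on 6/1/2013 and 6/15/2013; no response.
domain:spammyarticles.example.com

# Site owner requested payment to remove the links.
domain:paidlinks.example.net

# A single URL can also be disavowed on its own line.
http://blog.example.org/widgets/bad-link-page.html
```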
Take all of the links that you were unable to get removed and add them, or their domains, to the file as shown above. Filtering by the different responses you collected during your outreach will help you categorize the links.
Google has never really documented how long it takes for a disavow file to go into effect, but I recommend waiting a week before moving on to the final step: submitting a reconsideration request.
Step 5: Write and submit your letter of reconsideration. This is the final step of the process. For manual actions, you need to "request a review" in Google WMT so that your site can be reviewed by Google's search quality team, who will determine whether it now meets their guidelines. This letter needs to convince Google that you have done everything in your power to clean up the link profile and meet their quality guidelines. They want to see documentation of all your efforts: everything you did and are currently doing to overcome the linking penalty. Here are a few key points to include in your reconsideration request:
- Take full responsibility for the penalty, explain what caused it to occur, admit to what you have done in the past, and tell Google what you are doing now to treat the issue. It is your website and you should be the one taking the blame for the penalty. You can throw your SEO under the bus, but you should still take responsibility for their work. Did you ever pay for links in the past? Google wants you to include this type of information in the request.
- Include documentation of your outreach efforts. Rather than just saying you have e-mailed a handful of webmasters for link removal, provide documentation of the efforts you made when contacting the webmasters. Put all of this into a Google Doc and link to it in the letter. Make sure the document lists all of the bad links, the webmasters’ e-mail addresses, the different dates that messages were sent and the responses that you received.
- Discuss your disavow file. It is a good idea to talk about the disavow file that you uploaded, and it doesn't hurt to link to a copy of it in a Google Doc as well.
- Make it known that you are strongly committed to following Google’s Quality Guidelines and that your site is meant for users, not search engines. Explain that from now on you are going to follow the rules and that it is your goal to create unique and compelling content that is beneficial to users.
Once you have compiled all information and proof of your efforts through the process, you are ready to submit the letter. These requests are not evaluated by robots — they are read by humans, so make sure to use a pleasant tone in the letter. The time it takes for Google to respond varies, but based on my experience, it normally takes about one to two weeks to get a response. Once you’ve sent the letter, it is time to play the waiting game. Good luck.
Most of you have probably heard that Facebook recently unveiled the capability for users to enjoy clickable hashtags on their platform, just like Twitter and Instagram. This is a huge step for Facebook, allowing them to better organize public conversation.
Hashtags are words or phrases prefixed with the number sign, or pound symbol (#). They are used heavily on Twitter and other social media sites as a form of metadata tag. The first hashtag used on Twitter came from Chris Messina in 2007: "How do you feel about using # (pound) for groups. As in #barcamp [msg]?" The #barcamp hashtag was intended to bundle conversation about BarCamp, the global technology unconference gatherings that Messina helped found. Since then, hashtags have been selling like #hotcakes… so to speak.
The use of hashtagged terms has grown enormously in popularity over the years. And, like most things that gain a lot of exposure (the Kardashians, Tan Mom, etc.), there has been some backlash. The arrival of hashtags on Facebook even drove some users to create a page dedicated to hating them, called "This is not Twitter. Hashtags don't work here." I find this somewhat comical and don't really see what the fuss is about, but the page's creator seems very serious. Users feel as if Facebook is stealing ideas from Twitter, but really it is just staying up to date with social media trends.
How it Works
It works basically just like Twitter, though Facebook can be more private. You can either search for or click on a hashtag, and doing so brings up a screen with all the recent public posts containing that hashtag. If you are not friends with a person, any posts they share only with friends remain hidden.
From a marketing standpoint, it is wiser to allow all Facebook users access to your hashtagged posts. This is a great business and marketing tool, especially when one is trying to gain visibility in a specific market. Your business can also gain the reputation of providing fresh and engaging content when your Facebook page is joining different conversations about a trending topic.
What’s Next in Line
- Trending Hashtags – Twitter and other social outlets have feeds of trending hashtags but Facebook currently does not.
- Hashtag Advertising – Advertisers do not have the option to buy Hashtags at the moment, but that is a strong possibility in the near future.
When’s the last time you actually looked up a business in a phone book? It’s probably been years for me.
Whether I’m looking for a plumber or a pizza place, I turn to Google (sorry, Bing). It’s faster and more efficient. Within seconds, I can find the local businesses I’m looking for. And not only can I get their contact information, but I can read reviews, view images, and get directions for that business.
Naturally, this means local businesses face a new challenge: building an online presence. With people like me turning to search engines, it’s important that local plumbers and pizzerias appear in those results pages. Otherwise, people are going to take their business elsewhere.
Luckily, though, the challenge isn’t as daunting as you may think. There are a number of tools on the Web to help local businesses boost their search engine visibility.
Here are two of my favorites.
Yext Power Listings Plus is a unique tool that adds rich content answering the "who, what and when" of any business. Yext allows businesses to add dynamic product listings, professional bios, event calendars and more to their listings on over two dozen sites, all controlled from a central account. When a listing is displayed, it can show deeper results: a restaurant's menu, the products or services a business offers, calendars of all types, or biographies of the people working at an office, such as a doctor, dentist or chiropractor. Rather than just creating the listing, this product helps explain it. Rich content is imperative for a business listing; it enhances the user experience by delivering more information to your visitors, and you want the listing to be as helpful as possible.
Whitespark offers a local citation finder. After you submit a form with your company information, it hunts down citation sources all over the Web for you. I use Whitespark more and more as a research tool, since it sheds light on where competitors have their citations and where new citations are needed. The tool asks for the country, state, city and a main keyword relating to the business. Once you submit that information, it searches for citation sources where your company can get listed. When the search finishes, you can see the sites ranking on the first page for that localized keyword, where they have their sites listed, and the citations each of them currently uses. For each citation source, it displays the type of site, its Domain Authority, and whether or not your site is currently listed there. The coolest part is that it gives you the option to submit your URL, or even your entire business's information, directly to those sources. All in all, Whitespark lets you see where all of the top-ranking competitors are getting their sites listed.
These two tools should help get your local business a nice jump start. Hopefully with these two tools, you can start to build your online presence and no longer rely on the hopes and prayers that people find you in the phone book.
Choosing keywords for your business's website can be tricky, but it's something you have to do if you want people to find your business on the Web. I always like to take some time to learn about the site and understand what it will take for someone to get there. It's good to put yourself in the searcher's shoes and think about all the different words they might type into a search engine to get where they want to be. In this week's blog post, I would like to briefly discuss some of the different aspects of keyword research.
Broad and Exact Match Volume
Before I explain why one should stay away from keywords with low or no search volume, I should explain exactly what the search volume means.
Search volume is expressed as a number: the average amount of searches for a term in a month. Most people focus on two types of search volume, broad and exact. I prefer exact.
Broad volume is a collection of all searches involving any mixture of the keywords as well as other words not accounted for. This means the search volume is usually overestimated. The exact volume, on the other hand, is a much simpler number and should normally be one of the main factors when selecting the right keywords. The exact volume refers to the average number of searches for that specific keyword, exactly how it’s spelled with no other words added.
If a term shows no or extremely low volume, it is pointless to target. No one is searching for it, and if no one is searching for it, no one will reach the site through that term.
Of course, some businesses only want to compete for keywords within their own geographic location. A bagel shop in Philadelphia, for example, doesn’t want to compete for visibility with everyone that’s going after the term “bagel shop.” They just want to target people looking for bagel shops in Philadelphia.
Adding a location to a keyword ties that term to a specific area. In the bagel shop example, it would be in their best interest to add "Philadelphia" to the term "bagels." Furthermore, an individual may be looking for a place to buy bagels nearby and could search for "Philadelphia area" or even the slang term "Philly." Optimizing a site for all these different localized terms would benefit the business. Obviously, if a business is not local, it would be wise not to add localization to the keywords.
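Generating the localized variants is mechanical enough to script. A quick sketch in Python using the bagel shop example; the term lists are made up:

```python
base_terms = ["bagels", "bagel shop"]
# Include the city name plus common abbreviations and slang, per the advice above.
locations = ["philadelphia", "philly", "philadelphia area"]

# Cross every base term with every location variant.
localized = [f"{term} {loc}" for term in base_terms for loc in locations]

for keyword in localized:
    print(keyword)
```

The resulting list feeds directly into a keyword research tool to check which localized variants actually have search volume.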
There are a few key factors to consider when determining keywords. I like to look at the exact volume, broad volume, difficulty score, current Google rank and current Bing rank.
I get my keyword difficulty score from SEOmoz's Keyword Analysis tool. This number estimates how challenging it is to rank for a keyword; the score is calculated by analyzing the domain and page authority of the top 20 Google results for the entered term. I like to choose terms that score 50% or below, which are moderately competitive. That said, if the search volume is high for a specific keyword and the site already ranks for it, you can choose a term that exceeds a difficulty score of 50%.
The other two determining factors are Google and Bing rank. These are pretty self-explanatory. They are the current positions that the keyword sits on for its corresponding URL. If the site already ranks for a specific term, then there is a chance of it moving up in the results.
Like I said earlier, the first thing I do when choosing keywords is get a feel for the site: what type of site it is, what its content is about, what it sells, how it sells it, what the site wants to get out of it and so on. Knowing the top services or products sold is always a plus, as is understanding the scope of the location: local, national or global.
Once I have all this information gathered, I start writing down different keyword possibilities. After I have a few terms, I like to check the site's positions in SEMrush to get more ideas. There you can see the organic research and most of the terms the site already ranks for (sometimes not all of them, but it is still very helpful). The tool can also show competitors and the terms they rank for. I add the terms I feel are relevant to my ideas list, and once I have a few ideas, I take them to the Google AdWords Keyword Tool.
In the AdWords Keyword Tool, I type in all my keyword ideas. Normally I set the location to United States, though that can change depending on the business. I check off Exact and Broad match types and then search. Most times, I use the "Keyword ideas" tab rather than "Ad group ideas." From there I save all of my searched ideas, then scroll down for other ideas that Google suggests; most of these suggestions are closely related to the original terms and can help uncover new ones. Once I mark off all the terms I feel fit, I download and export them to an Excel sheet where I gather all my data.
After ruling out all the terms with no or almost no search volume, I find the keyword difficulty scores. Then I rule out all the scores that are too high: preferably anything above 50, though 55 is often my limit. It is important to choose terms with a high exact search volume, because that is proof users are searching for that exact term across the Web, and that is what's needed most. Finally, check each term's current position, if any, in Google and Bing; this can help determine whether a high-difficulty term still has a good chance of ranking.
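Those filtering rules are easy to encode once the exported data is in hand. A sketch in Python; the keyword data, volumes and difficulty scores below are invented purely for illustration:

```python
# Hypothetical keyword data: exact monthly search volume and a 0-100 difficulty score.
keywords = [
    {"term": "bagels philadelphia", "exact_volume": 720,   "difficulty": 42},
    {"term": "philly bagel shop",   "exact_volume": 90,    "difficulty": 35},
    {"term": "bagels",              "exact_volume": 60500, "difficulty": 78},
    {"term": "artisan bagel rings", "exact_volume": 0,     "difficulty": 12},
]

MAX_DIFFICULTY = 55  # the upper limit described above; 50 or below is preferable

# Drop terms nobody searches for and terms that are too competitive.
shortlist = [
    kw for kw in keywords
    if kw["exact_volume"] > 0 and kw["difficulty"] <= MAX_DIFFICULTY
]

# Surface the highest-volume opportunities first.
shortlist.sort(key=lambda kw: kw["exact_volume"], reverse=True)

for kw in shortlist:
    print(kw["term"], kw["exact_volume"], kw["difficulty"])
```

Here "bagels" is dropped for its high difficulty and "artisan bagel rings" for having no volume, which mirrors the two rules in the paragraph above.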
Once I have all these determining factors, I line them up in a spreadsheet, using filters to sort the different metrics. It is important to pay attention and use good judgment when selecting terms, comparing all the different statistics that have been compiled. Most importantly, keep an eye out so that no good-opportunity terms get missed.
All in all, once the finalized keywords are selected, the rest is history. Optimize your site to the best of its ability by using these keywords in on-site content and links, and even in blogs and links off-site. Use them in title tags, meta descriptions and most on-page elements, but avoid stuffing. After all, that's the point of all this research, right?
The Visual Website Optimizer tool is a great addition to any site. It can help increase conversions, sales, signups and any other goal a site has, through its integrated A/B testing. So what is A/B testing? A/B testing compares how effectively different versions of a page perform; whichever version produces the better results is selected as the one to run with.
The way the optimizer works on a website is very cool. You create two or more versions of a specific web page; users then land on the same page but see varied layouts, with images, colors, headers and so on altered to judge which layout works best. Data is collected on what users do on each version, and whichever version shows the better conversion is selected as the main page to use. Headers, titles, images, buttons and more can be altered through the tool's easy point-and-click editor, and zero knowledge of HTML is required; any changes you make to the website are very simple to perform.
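Under the hood, an A/B test boils down to randomly assigning each visitor a variant, recording conversions, and comparing the resulting rates. A toy simulation in Python; the conversion probabilities are invented for illustration, and a tool like Visual Website Optimizer handles all of this for you:

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

# Hypothetical underlying conversion rates for each page variant.
true_rates = {"A": 0.05, "B": 0.08}
results = {"A": {"visitors": 0, "conversions": 0},
           "B": {"visitors": 0, "conversions": 0}}

# Each simulated visitor is randomly assigned a variant; we then
# record whether they "converted" according to that variant's rate.
for _ in range(10000):
    variant = random.choice(["A", "B"])
    results[variant]["visitors"] += 1
    if random.random() < true_rates[variant]:
        results[variant]["conversions"] += 1

for variant, r in results.items():
    print(f"Variant {variant}: {r['conversions'] / r['visitors']:.2%} conversion rate")

# Pick the variant with the higher observed conversion rate.
winner = max(results, key=lambda v: results[v]["conversions"] / results[v]["visitors"])
print("Winner:", winner)
```

With 10,000 visitors the observed rates land close to the true ones, which is why real tests need a decent amount of traffic before declaring a winner.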
What Changes Mattered
Through its powerful multivariate testing, you can easily identify which specific changes increased conversion and which did not help at all. Rather than just showing which page performed better, it displays exactly what on the page caused it to perform better; it could be something as simple as the color of a button.
You can keep track of revenue on the site and track all conversion goals. Reporting is one of the most important factors, because without it we would not know which version or layout of a site is most effective. You get real-time data and real-time reports on all test performances, and you can view daily performance charts as well as performance trends over a span of time.
Behavioral & Geo Targeting
You can even go a step further and personalize your website to focus on your marketing strategies. You have the option to customize your web page rendering to each visitor to increase the sales. With the combination of multiple different targeting options, you can show a specific web page to a visitor that performs best for them. For example, if you want to show different content to different visitors, then the tool will measure their location, operating system and even the source from where they came from whether it is Google or Facebook and so on.
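Conceptually, that kind of targeting is just a first-match rule table keyed on visitor attributes. A hypothetical sketch in Python; the rule names and attributes are invented for illustration and are not VWO's actual API:

```python
# Hypothetical targeting rules, checked in order; the first rule whose
# conditions all match the visitor wins.
RULES = [
    ({"source": "facebook"}, "social-landing"),
    ({"country": "US", "source": "google"}, "us-search-landing"),
    ({"country": "US"}, "us-default"),
]
DEFAULT_VARIANT = "default"

def pick_variant(visitor):
    # Return the variant of the first rule fully matched by this visitor.
    for conditions, variant in RULES:
        if all(visitor.get(key) == value for key, value in conditions.items()):
            return variant
    return DEFAULT_VARIANT

print(pick_variant({"country": "US", "source": "google"}))
print(pick_variant({"country": "DE", "source": "facebook"}))
```

Ordering the rules from most to least specific is what lets a general fallback (like a country-wide default) coexist with narrowly targeted pages.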
Anyone Can Use It
This tool can help increase the chances of any type of conversion on any website, and through its A/B testing process you can learn a lot about the different possibilities for a web page. On top of all that, if you want to see how the tool works in different situations, Visual Website Optimizer publishes case studies on its site. It has been used by some of the biggest names on the market, like Microsoft, Hyundai, Disney and many more. I highly recommend this tool to anyone trying to increase conversions on their website, whether it's an e-commerce, informational or lead-generation site. A/B testing is a great way to test marketing strategies in today's market.
A special shout out goes to Nick Eubanks for his presentation of this tool at ShameOnUX.