As our own Chris Countey reported just moments ago, Google’s Matt Cutts announced at #PubCon in Las Vegas earlier today that the world’s most prominent search engine is releasing a “Disavow Links” feature within Webmaster Tools.
Formerly, getting damaging, low-quality links removed from Google's consideration was a tedious and often difficult process. Going forward, disavowing such links (once they are located) will be far simpler, and the tool is expected to become one of the most useful resources in the digital marketing industry.
The Disavow Links page, which is now live within Webmaster Tools, states:
“If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”
The tool will utilize a new “disavow.txt” file, which works similarly to the long-standing “robots.txt”: specific pages containing links (or even entire domains) can be added to the file and subsequently excluded as a ranking factor.
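For reference, the disavow file is plain text with one entry per line; a minimal sketch is below (the domains and URLs are placeholders, not real examples of bad links):

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
# Disavow a single page that links to your site:
http://link-farm.example.net/page-with-bad-link.html
```

The file is uploaded through the Disavow Links page in Webmaster Tools rather than hosted on your own server like robots.txt.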
While this tool has been rumored for quite some time and even hinted at by Cutts since July, the official announcement and debut of the new resource is a major leap forward for SEOs and is certainly going to impact the industry for the foreseeable future.
When business owners hire search engine optimization companies, they do so because they want to get their website ranked higher on Google or Bing. However, obtaining the conversions you want goes beyond landing on the first page of the SERPs. People who click the link to your site then have to be compelled to buy from you. It’s at this point where we seriously need to discuss content.
No one likes to hear that the content on their site isn’t very good. It can be a difficult subject to discuss. However, SEOs have a duty to be trustworthy partners when working on a site, and that includes making content suggestions. All the metadata can be pristine, but if what the user sees on the page raises red flags (misspellings, broken links, outdated information), they will grow skittish and probably move on to a site that looks more trustworthy. Remember, even in 2012, people are uncomfortable handing over their credit card information, despite security being better than ever.
Business owners need to consider giving their content an overhaul as part of their SEO efforts. Content that is engaging, modern looking, and welcoming is what will convince visitors to open their wallets and become customers. Of course, a great product is important. But visitors to your site must be persuaded to believe that your site is the best place to purchase this product from. Even if you’ve written your content yourself, it might be in your best interest to have a professional copywriter look it over or rewrite it for you.
It doesn’t take an online marketing company with a decade’s worth of experience to see that the relationship SEO service providers have with Google is a complicated one. Of course, much of this is owed to the recent string of Penguin and Panda algorithm updates (which we discuss quite often), but there are also several other key aspects of the world’s biggest search engine company that make our jobs equally challenging and exciting. Take, for example, the constant addition and removal of network features for Google’s numerous services.
Feature Removal and Secrecy
Just this week, in fact, Google announced that several changes will be made in the near future to a host of network functions. While the features being cut include only a few select apps that generally aren’t used all that often (don’t expect to see Maps go away, for example), this is still part of an ongoing trend with the company. It’s also a habit of Google’s that Microsoft openly mocked earlier this year.
Aside from the constant adding and dropping of support for its applications, Google also tends to be incredibly secretive about everything it does. This is especially true of anything regarding the inner workings of its search engine. Although we shouldn’t expect the company to unveil how its algorithms work, lest a new age of black hat SEO emerge, it still makes understanding what Google’s search engine deems worthy of high page rankings quite the task. Matt Cutts, the head of Google’s webspam team, does offer vague advice from time to time on his blog and on Google’s YouTube account, but what constitutes a perfectly optimized page is still mostly conjecture at present.
The SEO Back-and-Forth
Between Penguin and Panda, the apps, and the vague demands made of SEO companies, there are many ways that Google consistently impacts what we as online marketers and SEO consultants do day in and day out. These days, the top SEO companies are those that continue to hang on the company’s every word and attempt to predict its actions. While no one can know for sure what Google intends for the future, most of us do realize one important thing: staying ahead of the competition means constant innovation and revision.
Two days ago, Neil Young joined Twitter. Nearly 26,000 users are already following him, including me. So what did the massively influential singer-songwriter have to say as his first tweets?
As a fan, this was a huge disappointment. No insights into his famously introspective lyrics or deep explorations of his diverse musical styles? Not even close. Instead, some generic promotional drivel – clearly, this is not really Neil Young but rather a publicist or marketer who is endorsing his brand.
An article about Neil Young’s deceiving Twitter tease on GigaOM.com focuses on whether the fake hype is reason enough for social media companies to start flagging the difference between brand and personal accounts. It also raises the question of whether famous people’s promotional accounts will be excused if Twitter decides to ban fake names.
The most questionable part of the fake hype over the Neil Young Twitter account to me, however, is the boring content the marketer has published so far. As a former Content Development Team Lead at WebiMax, and as the current Social Media Manager, one thing I always stress is the importance of understanding your audiences and writing with them in mind.
Whether you’re a content marketer writing a blog post for a client or a social media marketer composing a tweet for a campaign, give the readers what they want. In this case, fewer dull marketing posts and a little more “Rockin’ in the Free World.” Neil Young’s distinctive guitar sounds and signature voice deserve it – and so do his fans.
I asked our content writers and social media marketers to come up with ideas for Neil Young tweets, and within minutes here were some of the best responses:
1. If I could tweet this through a harmonica I would: the @rollingstone exclusive premiere of new video Walk Like a Giant http://bitly.com/QSj1hD
2. While @jimmyfallon did a great job impersonating Neil Young for his show, Young’s lyrics are more thoughtful than this: http://bit.ly/S3bgtW
3. On 8/25 NBCNews.com mistakenly reported Neil Young dead instead of Neil Armstrong. Alive & well – album out 10/30 #RIParmstrong
4. Neil Young never walked on the moon, but now he’s on Twitter. That’s equivalent, right?
5. #MentionADateYouWillNeverForget 11-12-1945: The day of my birth! Read more in new memoir: http://bit.ly/OXWMun
Basically, Neil Young should hire WebiMax to manage his Twitter campaign. Which is your favorite tweet? Let us know in the comments below.
How many of you have friends that don’t use their real name on Facebook? Perhaps you do this yourself. If you’re like most people, a decent portion of your friends list has people swapping out their last names for their middle names. This all seems harmless enough, right?
Well, if you didn’t know, Facebook is rather unhappy with all of you using fake names! That’s because the less they know about their users, the more difficult it is for them to monetize their services. Social media companies have successfully leveraged Facebook to obtain sales for their clients, but Facebook itself is at a difficult crossroads when it comes to bringing in revenue.
Facebook’s goal is to be your official identity on the web. Eventually, they would like to be the main hub through which you do your banking or any other official business, including voting. However, this is impossible if people are using anything other than their real name. If the name you use on Facebook varies from the one on your credit card, this hurts Facebook’s business model. If advertisers can’t get an accurate reading on you and users like you, what good is Facebook to them?
While it’s Facebook’s official policy that you must use your legal name, this is nearly impossible to enforce, especially if your alias sounds like a real name. Currently, it is believed that 83 million Facebook accounts are spam accounts or duplicates made by people who want to keep certain aspects of their online lives private. In light of how poorly Facebook has performed since its IPO, it will be interesting to see how the company turns things around and earns the confidence from advertisers it so desperately needs.
7:08pm – Chris Countey speaking on Google Authorship
7:12pm – “Google is the referee in the world of SEO”
7:12pm – “Deliver quality content, or it will be susceptible to algorithm changes.”
7:14pm – “If your competitors are optimizing for certain keywords and you’re not, you’re left out.”
7:14pm – “If your domain isn’t even indexed anymore, you have a LONG way to go!”
7:14pm – “Banned domains, unnatural link building, Panda update, and the Penguin update are all working against you if you’re conducting black-hat SEO.”
7:15pm – “On-site SEO Playbook: Maximum Indexing Crawling, Unique Page Titles, URLs, File Names, Code Architecture, Microdata, Authorship, and Trust signals and Page Speed.”
7:17pm – “If you don’t put in a Meta Description, Google will decide what should go there. Take away their ability to assume.”
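The meta description advice above refers to a single tag in the page’s head; a hedged sketch (the site name and copy are hypothetical):

```html
<head>
  <title>Acme Widgets | Hand-Made Widgets</title>
  <!-- If this tag is omitted, Google composes its own snippet
       for the search results; supplying one keeps that under your control. -->
  <meta name="description"
        content="Acme Widgets sells hand-made widgets with free shipping. Browse our full catalog.">
</head>
```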
7:20pm – “What are your competitors doing off-site? We MUST make sure we are building links the proper way. Will other people want to read this?”
7:21pm – “Having your business listed in credible directories, such as the Chamber of Commerce, is fine. Directories such as 123business are not!”
7:25pm – “Check out Ian Laurie on Twitter, great insight and knowledge and you’ll laugh all day.”
7:26pm – “Ross Hudgens @RossHudgens is another valuable source for link building and competitive analysis.”
—- Bill Slawski takes the stage —-
7:28pm – “I told my parents I was here to talk about ‘crawling’ tonight — they didn’t quite get it!”
7:28pm – “There are 3 main aspects to a search engine. It crawls pages, indexes them, and shows them to searchers.”
7:28pm – “When robots first came out, people started writing programs to interact with them and abuse them.”
7:30pm – “Martijn Koster developed the Robots.txt protocol.”
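The protocol Koster developed is a plain text file placed at the site root; a minimal sketch, with a hypothetical domain and directory:

```text
# robots.txt, served from http://www.example.com/robots.txt
User-agent: *          # these rules apply to all crawlers
Disallow: /private/    # ask crawlers not to fetch anything under /private/
```

Note that robots.txt is advisory: polite crawlers honor it, but it is not an access control mechanism.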
7:31pm – “Important web pages have a high backlink count, have a high PageRank, and are in or are close to the root directory for sites.”
7:34pm – “Most crawlers will not only be Polite, but they will also hunt down important pages first.”
7:35pm – “Search Engines filed patents on how they might crawl and collect content found on Web pages.”
7:38pm – “IBM came out with a patent for link merging to distinguish between the amount of different backlinks on a webpage.”
7:39pm – “If they see content structures, they might say ‘this is different; we may count these as two independent link groups’.”
7:40pm – “We know Search Engines crawl, we don’t know how well they crawl.”
7:40pm – “Search Engines like it when there is only 1 URL per page and you can set this up to be different URLs per page. This makes a difference!”
7:42pm – “What happens when you have an E-commerce page with over 50 links per page? The rel=”prev” and rel=”next” markup is here to help associate these pages together.”
7:44pm – “This also works well for article pages, an article that is broken up amongst multiple pages. This tells Google these pages are together.”
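The pagination markup described above amounts to two link tags in the head of each page in the series; a sketch with hypothetical URLs, as it would appear on page 2 of a three-page article:

```html
<!-- In the <head> of http://www.example.com/article?page=2 -->
<link rel="prev" href="http://www.example.com/article?page=1">
<link rel="next" href="http://www.example.com/article?page=3">
```

The first page in the series carries only rel=”next”, and the last page only rel=”prev”.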
7:45pm – “The Rel=”hreflang” tells the search engine that these are the same pages, they’re just in different languages. We see this on multilingual pages including FedEx, UPS, United Nations, and so on.”
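In practice, each language version of the page lists its alternates via rel=”alternate” hreflang tags; a sketch with hypothetical URLs:

```html
<!-- Placed in the <head> of each language version of the page -->
<link rel="alternate" hreflang="en" href="http://www.example.com/en/shipping">
<link rel="alternate" hreflang="fr" href="http://www.example.com/fr/shipping">
<link rel="alternate" hreflang="es" href="http://www.example.com/es/shipping">
```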
7:49pm – “XML Sitemaps are a way to create alternatives for search engines to crawl web pages. These can be discovered without having the crawler physically crawl your web pages.”
7:50pm – “Use Canonical links, remove 404s, Validate with an XML Sitemap Validator.”
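A canonical link, as recommended above, is a single tag placed on duplicate URL variants pointing at the preferred version; a sketch with a hypothetical URL:

```html
<!-- In the <head> of a duplicate variant, e.g. a URL with tracking parameters -->
<link rel="canonical" href="http://www.example.com/product/widget">
```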
7:50pm – “Crawling versus XML Sitemaps. Google said it was discovering content on webpages faster with XML sitemaps than with crawling.”
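For reference, a minimal sitemap following the sitemaps.org protocol looks like this (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- one <url> entry per page you want discovered -->
</urlset>
```

The file is typically saved as sitemap.xml at the site root and submitted through Webmaster Tools or referenced from robots.txt.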
7:51pm – “Yahoo came out with a patent not too long ago saying ‘we’re going to crawl social media’.”