Going from Google Penalty to Reconsideration

A website owner responded to our advert in a panic, leaving several phone messages and email contact requests. He had woken to the dreaded email from Google informing him his site had been penalised due to unnatural linking behaviour. If you have been doing some aggressive link building in the past then you may get a similar email telling you your site has been de-listed for suspicious behaviour. The question is what steps to take to get out of Google jail.

Below are the steps I took to go from the warning letter to a reconsideration request.

Download Backlinks

Find out which links could be causing the problem. The difficulty here is that Google has been notoriously secretive about which links it counts for your site. A Google search for link:yourwebsite.com shows very few links, so we have to rely on other sources like Open Site Explorer and MajesticSEO to find our backlinks. Matt Cutts, in his disavow tool launch video, suggests that the warning letter comes with sample links found to be unnatural, but there was no such hint in our client’s email. You do get more backlink information from Webmaster Tools, though you have to dig a little: go to the Traffic tab, then click “More” under “Who links the most”. This gives you a list of all the sites linking to you, but you want the actual links, so click “Download more sample links”, which gives you a CSV of the actual URLs where your backlinks sit.
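If you want one master list from these sources, a rough sketch along the following lines will merge and de-duplicate the exports. It is plain Python; the file names are assumptions, and each export is assumed to hold the linking URL in its first column, so adjust both to match your own downloads.

import csv

# A rough sketch: merge backlink exports from Webmaster Tools, Open Site
# Explorer and MajesticSEO into one de-duplicated list. The file names below
# are assumptions; rename them to match your own downloads.
sources = ["wmt_links.csv", "ose_links.csv", "majestic_links.csv"]
seen = set()
for path in sources:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            # Keep only rows whose first column looks like a URL (skips headers).
            if row and row[0].startswith("http"):
                seen.add(row[0].strip())

with open("all_backlinks.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(seen)) + "\n")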

Do a Backlinks Audit

The difficult part of sorting out the backlinks list is deciding which links are harming your site. There are many places on the internet detailing what are considered poor links, including:

  • Bad neighbourhoods: links from gambling and porn/adult-themed sites.
  • Anchor text diversity: following the Penguin update, anchor text proportion is on every SEO’s mind. MajesticSEO has updated its tools to show a nice pie chart of anchor text, which you can use to ensure there is not too much focus on one or two money terms instead of a brand term (see the sketch after this list). Common thinking today is that your brand or website URL should comprise 75-80% of anchor text, with the rest used sparingly among money terms.
  • An increase in links from one particular site: again MajesticSEO is your friend for looking at backlink diversity. Webmaster Tools can also be useful, as it shows the total number of links from each site, so you might want to start there. An unnaturally high number of backlinks from one site can occur when your link sits in a sidebar or footer, which Google red-flags as a likely paid link.
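As mentioned above, anchor text proportions are worth measuring rather than guessing. A minimal sketch like this tallies anchors from a backlinks CSV export; the “AnchorText” column name is an assumption based on a MajesticSEO-style export, so adjust it to whatever your file uses:

import csv
from collections import Counter

# A minimal sketch: tally anchor text frequency from a backlinks CSV export.
# The "AnchorText" column name is an assumption; rename it to match your file.
counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[row["AnchorText"].strip().lower()] += 1

total = sum(counts.values()) or 1  # avoid dividing by zero on an empty file
for anchor, n in counts.most_common(20):
    print(f"{anchor}: {n} ({n / total:.1%})")

If the share of your brand term comes out well below the 75-80% mark, the money-term anchors are the ones to look at first.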

Link removal steps

Once you have your list, it is time to take steps to remove or discount the bad links. Start by writing to the webmaster or site owner with a friendly email asking them to remove your link from their website. Most website owners should not have an issue with removing the links – especially if it means they reduce the number of links on the page and get to keep the content of the article.
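Something short and polite along these lines tends to work best – the wording is only a suggestion, and the bracketed placeholders are yours to fill in:

Subject: Request to remove a link on [page URL]

Hi [name],

Your page at [page URL] links to our site, [your site URL]. We are tidying up
our backlink profile after a notice from Google and would be grateful if you
could remove just the link – the article itself can stay exactly as it is.

Thanks for your help,
[your name]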

There are some webmasters who will not respond, or who will try to hold you to ransom by asking for payment to remove a link. In this case Google has given us the new disavow tool to let them know you do not want to be associated with these links. To use the disavow tool you simply save the bad links as a plain text file, head over to the disavow tool, upload the list and hit submit.
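The file itself is just a plain text (.txt) list with one URL or domain per line; lines starting with # are comments that Google ignores, and the domain: prefix disavows a whole domain. A small example (the domains are placeholders):

# Webmasters emailed twice with no response
domain:spammydirectory.example.com
# Individual pages we could not get taken down
http://blogspam.example.com/page-with-link.html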

Filing a Reconsideration Request

Once you have taken steps to get bad backlinks removed or disavowed, you can send a reconsideration request. You need to fill in the reconsideration form and submit it to the Google webspam team, with as much detail as you can. The form is a mixture of confession box and bad-SEO reporting: it asks you to name any SEO who may have created these bad links for you. I guess this is one way for the webspam team to collate information and act where one individual or company is constantly reported. You also need to provide details of all the efforts you have made to get the bad links taken down, and whether you have asked Google to dissociate you from the links via the disavow tool.

After you fill in the request form you will get a message confirming you are being reconsidered for re-indexing. Google does say it can take several weeks, depending on the number of reconsiderations they are working on, so don’t hold your breath!

 

Google Releases Disavow Tool

Google has finally released a disavow tool for cleaning up bad links. It was widely expected after Google hinted it was working on a tool similar to Bing’s disavow tool. The tool has also been much needed following Google’s Penguin update, which aimed at penalising sites with spam backlinks. This led to some website owners bombing their competitors’ sites with bad links – an act known as negative SEO. It further led to link hostage situations, with some site owners extorting money from webmasters in order to remove the links pointing to their sites, and the ensuing legal threats.

The tool, however, comes with several warnings, as expressed in the release video. First and foremost, it is not for every mom-and-pop set-up but is aimed at advanced users, much like rel=canonical.

Secondly, SEOs should not jump in and indiscriminately disavow links they think could be below par. The main starting point should be an unnatural links letter from Google; in it, Google may even point out a few examples of the kind of links they find to be unnatural. You may also have some links, the result of previous spammy SEO link building, that you are not proud of and now wish to disassociate yourself from.

Also, if some links are clearly not ones you would like pointing to your site, then you should go ahead and disavow them. These can be malicious links, such as ones from porn and gambling sites, possibly the result of negative SEO.

Google also suggests the tool should be used only after trying to get backlinks taken down manually. A take-down is better anyway, since other search engines and individuals will otherwise still see the bad links pointing to your site.

One thing to note is that Google will not automatically and immediately act on the disavow request. Firstly, they reserve the right to ignore the request, in the same way they may ignore a rel=canonical pointing to a 404, for instance. The request will also take a while to work, as Google recrawls the disavowed sites, though you can point out your disavow request in a reconsideration request. Lastly, if you later change your mind, removing the disavow will take a lot longer and the links may not regain 100% of the value they once passed.

Check the disavow tool here

How to use webmaster tools

Google’s Webmaster Tools is a powerful piece of software for dealing with a number of technical issues affecting a site. Below is good practice on how to use Webmaster Tools from Google.

The first thing is to make sure your objective is in line with the company’s overall goals. For instance, if the stated company goal is to increase online sales of specific merchandise, the approach will be different from one aimed at creating sales leads. This can also help the SEO prioritise time on products that already have visibility in the market, which is much easier than creating demand for an item.

Sign up for email forwarding

Email forwarding enables important messages from Google about your site crawl to be sent directly to your inbox. This includes increases in crawl errors as well as malware information.

Check out search queries

The Webmaster Tools search queries report shows the keywords that are already doing well in the search landscape. You can highlight buying queries that have visibility and some click-through as candidates for improvement. Do this by starring those queries so that you have a ready-made report every time you log in.
You could look to increase CTR for a particular page by doing the search in Google and critically reviewing what the searcher sees. Amend the title and meta description text if it does not do enough to prompt a click.
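For instance, a more clickable title and meta description might look like the snippet below – the wording is purely illustrative:

<head>

<title>Blue Widgets with Free Next-Day Delivery | YourBrand</title>

<meta name="description" content="Order blue widgets from £9.99 with free next-day delivery and a 30-day money-back guarantee.">

</head>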

Use keywords to understand how to target your content

The keywords feature is a result of Google crawling your site; the keywords represent what the search engine thinks your site is about – similar to using the keyword tool. You should look at the extracted keywords to determine whether they are representative of the phrases you wish to rank for, then review the content on your site to make it more descriptive.

Reduce duplicate content through HTML suggestions and URL parameter handling

Look at the URLs that are ranking for your queries and determine whether there are any duplicate content issues, represented by duplicate URLs appearing for the same search query. You can then use the usual duplicate content methods, and specific URL parameter handling within Webmaster Tools, to demote some URLs.
Also look at the HTML suggestions section for pages that have duplicate titles or meta descriptions.

Diagnose crawl errors to fix broken links

Crawl status tells you whether Google is able to successfully crawl the site. You should look at the crawl error sources and fix 404 errors, so that you capture visitors from other sites and accumulate PageRank from external links to the right URLs.

Prioritise content by comparing internal links

Look at the internal links report and make sure all important pages are well linked from the homepage. Conversely, less important pages can sit deeper in the structure, with fewer links from the homepage.

Verify crawler access through Fetch as Googlebot

Fetch as Googlebot helps you ensure redirects, dynamic pages and URL rewrites work as expected. Click through to the fetched result to make sure the page content is readable and not hidden behind JavaScript or images.

Use site performance to improve page loading speed

The site performance feature shows you the page load speeds of different pages and offers improvements. You should aim for a load speed of less than 2 seconds for most eCommerce sites.

Google Analytics in Webmaster Tools – luvly jubbly!

Google has announced the integration of Analytics into Webmaster Tools after a 4-month pilot test program. The process is achieved by linking your Analytics account to your Webmaster Tools account for the same domain(s).

Since the pilot, Google has been busy perfecting the tool and now deems it ready for public use.

The tool extracts the two data sets and integrates them to produce more detailed reports.


For query data you get the number of impressions, clicks, average organic position of the keyword and click-through rate (CTR) for the top 1,000 query phrases.

For landing pages you will also be able to see the number of impressions and clicks, organic positions of keywords and CTR for the top 1,000 landing pages.

This is a free tool which combines two sets of data from the two free programs that are highly recommended for data analysis. You can sign up for Google Webmaster Tools here and the Analytics tool here. Combining the two is easy using Google’s instructions.

I have wondered many a time why I need to download both sets of data separately and try to somehow reconcile search keywords with impressions and organic ranking data. Now there is no need to mess about with Excel spreadsheets, with the data in one place.

You will be able to see the success of your SEO efforts by watching average ranking positions increase or decrease and relating that to visitor numbers. With Google Analytics in Webmaster Tools, an SEO services consultant can easily show their client the change in visibility, impressions and clicks to justify their work and ROI, given average margins for the product or service.

 

Robots.txt file – Do I Need It?

A robots.txt file basically tells search engines which parts of your site not to crawl. It is a plain text file placed in your site’s top-level directory. It is useful for restricting parts of your site you do not want indexed by certain search engines. The simplest example allows all crawlers everywhere:

 

User-agent: *
Disallow:
Other formats include the following:

Disallow all crawlers

User-agent: *
Disallow: /

Restrict folder

User-agent: *
Disallow: /private/

Restrict file to all robots

User-agent: *
Disallow: /directory/file.html

Some may wonder whether it is worth having a robots.txt file at all if you want to give search engine crawlers unrestricted access. Matt Cutts has answered in his weekly digest that it is useful to have the file even if all it says is disallow nothing.


Canonicalization Link Element

The canonical link element is something that the Google, Yahoo and Bing search engines said they would support at SMX West, way back in 2009. The element addresses two problems that webmasters find in getting ranked – duplicate content, and back-link equity being split between different URLs that are basically the same. Examples include different homepage URLs, as follows:

www.yoursite.com
yoursite.com
www.yoursite.com/index.html
yoursite.com/index.html
… and so on.

Other good examples are internal pages on an ecommerce site where the product comes in some variation that may not be of immediate interest to surfers, such as a different colour or size. In this case you can point all variations, such as yoursite.com/widget-blue/, to yoursite.com/widget/ as the preferred URL to get indexed by the search engines. Once a searcher lands on the relevant page, they can make a choice of the various colours, sizes, etc.


In most cases the issue is best fixed upstream by:

  • Getting the CMS to always produce standard URLs.
  • Being consistent with your internal linking, though you cannot always control how someone links to you externally.
  • Redirecting alternative pages to the preferred URL with a 301 redirect, so that the other pages automatically serve up the standard page (see the .htaccess sketch after this list).
  • For the homepage, specifying your www or non-www preference in Google Webmaster Tools. You can also submit one preferred URL in your sitemap, which Google will take as your preferred URL.
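As an illustration of the 301 option, a sketch like the following in an Apache .htaccess file (assuming mod_rewrite is enabled, and with yoursite.com standing in for your own domain) sends all non-www URLs to the www version:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]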

However, if you cannot fix the issue before publishing with one of these options, then one of the best remaining options is to use the canonical link element:

<head>

<link rel="canonical" href="http://www.yoursite.com/preferred-url">

</head>

You simply place the preferred URL in a canonical link element within the head tags of the page. That is all for now; we shall explore more canonical issues in future posts.

Below is the video of Matt Cutts announcing support for the canonical link element after SMX 2009.