Manual spam action revoked

Further to our Webmaster Tools reconsideration request for the unnatural outbound links manual action last week, we have received another email from the webspam team saying the manual spam action has been revoked. This applied not to one but three sites, and none had ever accepted payment to publish content. One relied heavily on content from a guest blogging network that has reportedly been subject to Google's network clampdown, and the other two had published one or two such articles in the past. However, the sites were, and still are, listed in the blogging network's directory as seeking content from guest writers.

On a positive note, guest bloggers have approached me this week to publish content, and they have agreed to do so despite external links on the site now being nofollowed. It is a shame the blogging network was subject to the action, as it was a great meeting point for writers and publishers, and I have personally enjoyed the relationships built there.


Unnatural outbound links

Last week we received a webmaster email telling us that Google thinks we have been selling links on three of our blog posts. This was a surprise, as we had not participated in any link-selling activity on these blogs, although we had accepted guest blog posts. We also noticed our PageRank disappeared back in the December PR update.

Actions to remove links

We could not go through three years of blog posts to remove links that we suspected were in contravention of Google's guidelines, nor could we nofollow each one individually. Since the blogs were in WordPress, our solution was to use a plugin to nofollow all external links.
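The plugin approach boils down to rewriting post HTML so that every off-site anchor carries `rel="nofollow"`. Here is a minimal, hypothetical sketch of that idea (the domain name and the regex-based approach are illustrative assumptions, not the actual plugin's code):

```python
import re

SITE_DOMAIN = "example.com"  # assumption: your own domain, left untouched

def nofollow_external_links(html: str) -> str:
    """Add rel="nofollow" to every <a> tag pointing off-site."""
    def fix(match):
        tag = match.group(0)
        href = match.group(1)
        # Leave internal and relative links untouched
        if SITE_DOMAIN in href or not href.startswith("http"):
            return tag
        if "rel=" in tag:
            return tag  # already has a rel attribute; leave as-is
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>', fix, html)

post = '<p><a href="https://other-site.com/page">guest link</a></p>'
print(nofollow_external_links(post))
```

A real plugin would hook into the content filter rather than post-process strings, but the transformation is the same.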

WMT reconsideration request

Following the implementation of the nofollow plugin, we replied to Google through Webmaster Tools via a reconsideration request, explaining what we did to address their concerns. We now wait and hopefully will get positive feedback from the big G.

Update: We see from this post on Search Engine Land that the blogging network has recently been hit under the Google blog network clampdown. This is a surprise, since we were given to understand that guest blogging was an accepted practice.

A/B testing vs multivariate testing

A/B testing is an easy way of comparing two versions of a page to see which converts better. Multivariate testing, on the other hand, compares multiple combinations of possible design changes and can be expensive and complex.

How do the two compare?

A/B testing requires little development and can be implemented by SMBs with a limited budget and technical know-how. You can get a crowdsourced web page design at very low cost and use Google Analytics to set up the tracking parameters.
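To decide whether version B really converts better than version A, a simple two-proportion z-test is usually enough. Below is a minimal sketch; the visitor and conversion numbers are made up for illustration:

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how confidently does B beat A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: A converts 200/10,000 visits, B converts 260/10,000
z = ab_z_score(200, 10_000, 260, 10_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

Tools built on top of Google Analytics do this arithmetic for you, but it is worth knowing that the underlying test is this simple.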

SEO copywriting best practice

This SEO copywriting best practice guide is all about getting traffic from Google and other search engines. Writing for search engines is not that difficult as long as you follow some simple ranking rules:

1. Write appropriate original content that people are looking for.

2. Use keywords and synonyms in your writing so that Google knows what your content is all about.

3. Link to and from relevant content via hyperlinks, images and video.

4. Give images alt (alternative) attributes.

Keyword research

It is a good idea to do some basic keyword research before writing any content in order to know what people are searching for. You can easily do this by:

  • Searching in Google and seeing what other relevant content is suggested as being popular.
  • Gauging keyword commercial value by the number of advertisers appearing in the PPC section. But note that not all newsworthy material has commercial value, and some future events may have no PPC interest at present.

There are also free keyword research tools like the following:


AdWords Keyword Planner

You now need to create an AdWords account to use the Google keyword tool. It shows how popular a particular keyword may be, based on the number of people searching for it. There are several filtering options, including country, language, and include/exclude words.


Although this information should be taken with a pinch of salt, you can gauge the popularity by simply looking at the following factors:

  • How often a search phrase is used monthly
  • How competitive the term is in Google


Another excellent keyword research tool simply scrapes Google's suggest list of keywords to show what is popular in searches related to your keyword. This is quite useful because Google suggest influences searchers' behaviour in some cases. You can take the suggested keywords and use them in the AdWords Keyword Planner to see how many people are actually searching with that keyword.
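Scraping Google's suggestions yourself is straightforward against the public suggest endpoint. The sketch below builds the request URL and parses the JSON array the endpoint returns; the endpoint address and response shape are assumptions based on common usage, and the sample response is made up:

```python
import json
from urllib.parse import urlencode

# Assumed public suggest endpoint (client=firefox returns plain JSON)
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(keyword: str) -> str:
    """Build a Google Suggest request for the given seed keyword."""
    return SUGGEST_URL + "?" + urlencode({"client": "firefox", "q": keyword})

def parse_suggestions(raw: str) -> list:
    """The response is a JSON array: [query, [suggestion, suggestion, ...]]."""
    data = json.loads(raw)
    return data[1]

url = build_suggest_url("seo copywriting")
# A made-up sample of what the endpoint returns:
sample = '["seo copywriting", ["seo copywriting tips", "seo copywriting course"]]'
print(parse_suggestions(sample))
```

The suggestions you get back can be fed straight into the Keyword Planner to check search volumes.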

The list below is by no means a full list of best practice, but including some of these factors will help your article or blog post rank well and thereby attract good traffic from the search engines. When writing content, try to follow these simple guidelines:

The perfectly SEO-optimised page

  • Keyword research: This forms the basis of how you will structure your article, based on what people are searching for. Use at least one or two variations of a term, potentially splitting up keyword phrases.
  • Keyword spread: Good practice is to include the phrase towards the top of the document, then a few times through the document, and again towards the end, with possible hyperlinks to where to go for more information. Aim for no more than 1-3 keywords and synonyms per article or blog post.
  • Keyword repetition: Repeat keywords two to three times every 100 words, i.e. a keyword density of 2%-3%, and try to make use of synonyms.
    • You can use an online keyword density analyser to check this.
  • Write original content: Use content that is not found elsewhere online. It is thought that Google ignores duplicate content when ranking. If you are basing an article on content from elsewhere, make sure to change phrases and paragraphs (or use quotations and refer to the source).
  • Keyword-rich URL: Include keywords in URLs where possible.
  • Page title: Use the exact target keyword phrase in the document's title. The most important of the on-page keyword elements, the page title should preferably employ the keyword term/phrase as the first word(s).
  • Document length: Try to write 350-500 words on a single ranking subject (keyword).
  • Bold highlights: You can highlight (bold) some important words in a paragraph.
  • Page headers: Break long articles into headers (H1, H2 and H3) with section keywords where possible.
  • Link to relevant content: Use keywords to link to your main site and other relevant articles in order to pass link value and provide a call to action.
  • Link out to authority sites: Link some important words to authority sites/pages when appropriate, but not the main keywords and not competing sites. E.g. if you are writing about sports, link to the relevant sporting authorities.
  • Link internally: Try to link internally within the blog to related topics in order to boost the value of the blog. Also link to the main site to pass SEO value and provide a call to action.
  • Image alt tags: Use a keyword-rich alt tag that describes the image. Google uses this as further evidence of what the page is about.
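The 2%-3% keyword density guideline above is easy to check for yourself. Below is a minimal sketch of a density calculator; it does a simplistic whole-phrase count for illustration, not necessarily what any particular tool does:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    # Count every position where the keyword phrase appears whole
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # Each hit spans n words; density = words used by the keyword / total words
    return 100.0 * hits * n / len(words) if words else 0.0

text = "seo tips: good seo needs patience, and seo needs content"
print(round(keyword_density(text, "seo"), 1))  # 3 hits in 10 words -> 30.0
```

Real articles should come out far lower than this toy example; if your own check reads well above 3%, the text probably sounds stuffed to readers too.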

Don’t let these guidelines hold you back!

Finally, do not let any SEO copywriting best practice guidelines hold you back from creating content, as any content is better than none. Also make sure the guidelines do not go against the tone of voice of the organisation's communication.

Local Citation SEO boosts Visibility by 179%

An agency has shown that doing some basic Google Places optimisation increased visibility by 179% across 315 businesses in the US.

The steps followed by the agency included:

  • Creating a custom business description for each location
  • Adding more content such as photos and videos
  • Removing duplicate listings on Google and other directories
  • Amending listings for long and short tail keywords


See the results as reported at Search Engine Land.

Disposable credit card numbers

One of the benefits of signing up with a service like PayPal is that you can go to the backend and remove recurring weekly or monthly charges from your profile. The same cannot be said for a credit card signup, which makes it more difficult to unsubscribe.

If you have ever had to sign up for a product or service with a valid credit card but were afraid of being billed forever, this could be the perfect remedy. The service generates a valid credit card number that you can use across the web for testing your ecommerce site or for signing up to sites that insist on credit cards. A useful service for signing up for stuff you just want to try out before buying.

Advanced Keyword Research Tools

There are a number of advanced keyword techniques you can use to expand your keyword terms. Some good techniques include the following:

Google keyword planner

This is now the only keyword tool from Google following the demise of the external Keyword Tool. The process includes:

  • Put a bunch of keywords into the Keyword Planner tool
  • Limit to 100 searches per month or more
  • Add negative keywords to exclude
  • Add positive keywords to include
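The filtering steps above can be sketched as a simple function over a keyword-to-volume map. The search volumes and word lists below are made up for illustration:

```python
def filter_keywords(volumes, min_searches=100, exclude=(), include=()):
    """Keep keywords with enough monthly searches, drop any containing
    a negative word, and (if an include list is given) require at least
    one positive word."""
    kept = []
    for kw, vol in volumes.items():
        if vol < min_searches:
            continue
        if any(neg in kw for neg in exclude):
            continue
        if include and not any(pos in kw for pos in include):
            continue
        kept.append(kw)
    return kept

volumes = {  # hypothetical Keyword Planner export
    "seo tools": 5400,
    "free seo tools": 2900,
    "seo tools crack": 320,
    "seo widgets": 40,
}
print(filter_keywords(volumes, exclude=["crack"], include=["seo"]))
```

In practice you would export the planner results to CSV and load them into the `volumes` dictionary before filtering.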


Scrape your site for top landing pages

Scrape your blog or website analytics and take the top landing pages by visits.

Put the top landing pages into the Keyword Planner tool and gather the keywords that are most relevant based on searches.

Add negative keywords to exclude

Add positive keywords to include, e.g. if you were looking for content ideas for a blog post you might want to include words like:

  • who
  • what
  • where
  • to
  • vs
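Those question-style modifiers can also be combined with seed keywords programmatically to generate candidate phrases for the planner. The seeds and modifiers below are illustrative:

```python
from itertools import product

def expand(seeds, modifiers):
    """Prefix and suffix each seed keyword with each modifier."""
    phrases = []
    for mod, seed in product(modifiers, seeds):
        phrases.append(f"{mod} {seed}")
        phrases.append(f"{seed} {mod}")
    return phrases

seeds = ["keyword research"]
modifiers = ["what", "vs"]
print(expand(seeds, modifiers))
```

Many of the generated combinations will be nonsense; the point is to produce a cheap candidate list and let the search-volume filter throw the junk away.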



Ubersuggest

Take a generic keyword and put it into Ubersuggest. Grab a list of the most relevant keyword suggestions and:

  • Put in keyword planner tool
  • Only show results above 100 searches per month
  • Add negative keywords or keywords to exclude



Scrapebox

Put a bunch of keywords into the Scrapebox keyword tool and choose a search level of up to 4. This will bring up searched keywords, which you can then put back into Scrapebox to search a further level down.

Download the keywords from Scrapebox and put them into the Keyword Planner tool.

The video below discusses some more advanced methods.


Footprints revision: using minimum effective dose

Building effective footprints is a must for any link builder or SEO wanting to scrape the search engines. While we may all think we have it wrapped up after a lot of experience working with footprints every day, there is always something new to learn. Looplines of Scrapebox introduced the idea of the "minimum effective dose", which can help stop a footprint from becoming too long. The idea is to use the minimum effective keywords needed to return results while discarding non-essential keywords, such as "proudly" in "proudly powered by".
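One way to apply the minimum-effective-dose idea is to generate progressively shorter variants of a footprint and keep the shortest one that still returns enough results. In the sketch below, the result-count function is a stand-in you would wire to your actual scraper, and the counts are made up:

```python
def footprint_variants(footprint):
    """Yield progressively shorter variants, dropping leading words."""
    words = footprint.split()
    for i in range(len(words)):
        yield " ".join(words[i:])

def minimum_effective(footprint, result_count, threshold=1000):
    """Return the shortest variant that still returns enough results."""
    best = footprint
    for variant in footprint_variants(footprint):
        if result_count(variant) >= threshold:
            best = variant
    return best

# Stand-in for a real scraper; "by" alone is modelled as too generic
# to return usable, on-topic results.
fake_counts = {"proudly powered by": 1_200,
               "powered by": 1_500,
               "by": 0}
print(minimum_effective("proudly powered by", lambda fp: fake_counts[fp]))
```

Here "proudly" gets discarded because "powered by" alone clears the threshold, which matches the example in the text.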

Does having more pages make it easier to rank?

Well, yes and no. Many webmasters and SEOs may assume that the more pages a site has, the easier it is to rank for anything. But according to Google's Matt Cutts, that is not necessarily the case. More web pages do not equate to easier rankings for a target keyword. However, having many pages on a site can help rankings by getting deeper crawling and attracting more links to those pages, and therefore passing PageRank around the site. Having more pages also gives you a better chance to rank for the more targeted keywords of individual pages.

No PageRank Update Soon

Many SEOs and webmasters have been wondering why their sites' PageRank is not changing despite a lot of activity publishing fresh content and gaining quality backlinks.


Google will not update the PageRank Toolbar any time soon, according to Matt Cutts. This has come as no surprise, since the SEO community has now waited through one of the longest periods between PageRank updates, the last one being in February 2013. It has also been rumoured that Google may do away with PageRank altogether.

This revelation will make the work of SEOs much tougher, coming along with the loss of organic keyword information in Analytics about how people find your site and the removal of the Keyword Tool that many used to research keywords for page content and link building.

The reason Matt Cutts gives for Google not updating PageRank as often is that the toolbar is not easy to install in Internet Explorer and that Chrome does not have it. He does not mention Firefox, or the Chrome plugins that expose this data.

Content Does Not Affect PageRank

Another eye opener was Matt Cutts clarifying that PageRank is not actually about the amount and quality of content on the website directly. PageRank is about the number and quality of backlinks. If the sites linking to you have high PageRank, they will pass that on and improve your PageRank too. However, it is a bit of a chicken-and-egg situation: if you produce quality content, quality sites may be more willing to link to you.

Site Structure

A good pyramid-type site structure is also important in passing PageRank. The ideal structure is one where important pages are no more than a couple of clicks away from the homepage.
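You can check click depth from the homepage yourself with a breadth-first search over your internal link graph. The link map below is a made-up example of a small pyramid structure:

```python
from collections import deque

def click_depths(links, home="home"):
    """BFS over the internal link graph: minimum clicks from the homepage."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

site = {  # hypothetical internal link map: page -> pages it links to
    "home": ["services", "blog"],
    "services": ["seo-audit"],
    "blog": ["post-1", "post-2"],
}
depths = click_depths(site)
print(depths["seo-audit"])  # 2 clicks from the homepage
```

Any page that comes out more than two or three clicks deep is a candidate for an extra internal link higher up the pyramid.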