Building effective footprints is a must for any link builder or SEO who wants to scrape the search engines. While we all think we have it wrapped up after working with footprints every day, there is always something new to learn. Loopline of Scrapebox introduced the idea of a “minimum effective dose”, which helps stop a footprint from growing too long. The idea is to use the minimum effective keywords needed to return results while discarding non-essential words such as “proudly” in “proudly powered by”.
I find myself looking up footprints almost every day and having to remember which spreadsheet, bookmark or Word document to look in. So it was a delight to stumble upon Chase the Footprint – http://chasethefootprint.com/.
At first it looks like a very simple tool, hardly worth a mention, but when you dig deeper you find its treasures.
There are common top-level categories like guest blogging, directories, forums and blog commenting, and each contains sub-categories with all the footprint terms you could ever imagine. You still have to use some imagination, like putting keywords in quotes or adding negative terms to pinpoint prospects.
When you hit return you are immediately taken to a Google search with the footprint filled in. After that it is a question of sifting through the results to find good prospects.
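The quoting and negative-term trick above can also be scripted when building queries in bulk. A minimal sketch, assuming a hypothetical helper (`build_footprint_query` is my own illustration, not part of Chase the Footprint or Scrapebox):

```python
# Sketch: compose a Google footprint query with the footprint in quotes
# and optional negative terms. All inputs below are illustrative.
def build_footprint_query(footprint, keyword, exclude=()):
    """Quote the footprint, append the keyword, and add -term exclusions."""
    parts = [f'"{footprint}"', keyword]
    parts += [f"-{term}" for term in exclude]
    return " ".join(parts)

query = build_footprint_query("powered by wordpress", "car finance",
                              exclude=["forum"])
print(query)
# → "powered by wordpress" car finance -forum
```

The same helper can generate a whole keyword list's worth of queries for pasting into a scraper.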
There are instances where you want to scrape a list of URLs for link building but would like to stay relatively current. For instance, you scrape forums for your keywords but end up with very old threads that get you instantly banned when trying to post. In such cases you want to restrict results either to the last x months or to a particular date range.
Restrict to last x months
To restrict results to the last x months, add the search operator date:x, where x is the number of months to go back. For instance, date:3 in your query returns only results indexed within the last 3 months.
Dates within a range
It's a little trickier to restrict results to a particular date range. Basically, you enter your date range in normal format at the site and it converts it to Julian format. To convert normal dates to Julian format there is a tool at http://jwebnet.net/advancedgooglesearch.html#advDate.
You then add this to your advanced search using the daterange: operator. For example, “keyword daterange:2455928-2456293” returns results for your keyword indexed in 2012 (1/1/12-31/12/12).
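If you would rather not depend on the online converter, the Julian day numbers daterange: expects can be computed directly; this is standard Gregorian-to-Julian-day arithmetic, and the helper names are mine:

```python
# Sketch: convert calendar dates to Julian day numbers and build a
# daterange: query. Standard Gregorian -> JDN integer arithmetic.
def julian_day(year, month, day):
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

def daterange_query(keyword, start, end):
    """Build a query like 'keyword daterange:2455928-2456293'."""
    return f"{keyword} daterange:{julian_day(*start)}-{julian_day(*end)}"

print(daterange_query("car finance", (2012, 1, 1), (2012, 12, 31)))
# → car finance daterange:2455928-2456293
```

The output for the 2012 range matches the example above.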
“Pre-Panda/Penguin, there was a pretty specific formula that would
work the majority of the time for ranking a website in Google.
Obviously, this was dependent on the competition too, but the key
strategies went something like this…
1. Get a bunch of links – a lot of them
2. Get links with your keyword in them
The formula looked like this:
– 60% of your links should be your main keyword
– 15% of links are your main keyword + other words (eg: read about,
check out, etc)
– 15% of your links generic text (eg: click here, visit site)
– 10% of your links as the text (eg: your link URL used as the text)
And like a lot of people have experienced, when Google's
Penguin update came around, it affected this “formula”.
BUT there’s one very good reason that some of our sites
didn’t get hit at all, or recovered extremely fast. And it has
absolutely NOTHING to do with where we were getting the links from,
but more so with “how” we were linking… i.e. the link
density and anchor text we were using.
We quickly realized that the old way of doing things needed to be
changed. Slamming your sites with a bunch of keyword rich anchor
text is finished – it won’t work.
But the reason we've now rebounded is because of the following:
1. It wasn’t the links that were the problem
Getting a ton of links still works. Do I need to repeat that?
Most people froze and just stopped building links altogether,
fearing that if they continued, their sites would get hit even harder.
They were wrong. You see, I figured that if you suddenly stopped
business as usual, that it would be a sure-fire sign to Google that
you were up to something and then they would hit you even harder.
They were looking for a reaction on your part. If something
suddenly changes, you’re likely gaming the system. If your
links are in fact natural, well then why would anything change?
So we got that part right and now we’re almost back to
pre-Penguin traffic levels and rankings.
But there was a second thing…
2. Anchor text links have CHANGED…
Our formula for the amount of keyword anchor text we used was now
defunct. We didn’t change it right away (keeping with what I
outlined above), but we did slowly phase in a new formula.
The new formula goes a little something like this (obviously
adjusting as needed, based on our competition too):
20% – your main keyword that you’re wanting to rank for
25% – related LSI keywords (these are keywords related to your main
keyword. Think of these as the “related” searches that
Google shows you as you’re typing in their search field).
35% – your URL as the anchor text, as well as other combinations
20% – generic keywords
Google is now looking for a more diverse linking profile – a more natural one.
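The new percentages are targets rather than exact counts. As a rough sketch, a batch of links can be split across those buckets like this (the largest-remainder rounding is my own choice so the counts sum exactly; it is not part of the author's formula):

```python
# Sketch: allocate a batch of links across anchor-text buckets using the
# new-formula percentages, with largest-remainder rounding.
def allocate_anchors(total, weights):
    raw = {k: total * w / 100 for k, w in weights.items()}
    counts = {k: int(v) for k, v in raw.items()}
    shortfall = total - sum(counts.values())
    # Hand leftover links to the buckets with the largest remainders.
    for k in sorted(raw, key=lambda k: raw[k] - counts[k],
                    reverse=True)[:shortfall]:
        counts[k] += 1
    return counts

new_formula = {"main keyword": 20, "LSI keywords": 25,
               "URL / brand": 35, "generic": 20}
print(allocate_anchors(50, new_formula))
```

For a 50-link batch this gives 10 exact-match anchors, with the rest spread across LSI, URL and generic anchors.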
You may want to check the status of many pages in a website to know if they are alive or dead so as to decide what to do with them – e.g. redirect them to a live page.
Create a list of Indexed Pages
The first thing you want to do is create a list of webpages for the site. You can do this by using the Scrapebox sitemap plugin or by using the site:yourdomain.com command to get a list of webpages indexed in Google, Yahoo and Bing. Once you have this list you need to remove duplicates from the different search engines so that you have a clean list to work with.
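The de-duplication step is easy to script. A minimal sketch (the normalisation rules here – lowercasing and stripping a trailing slash – are my own simplification; Scrapebox has its own remove-duplicates feature):

```python
# Sketch: merge site: results from several engines into one clean list,
# treating trivial variations (case, trailing slash) as duplicates.
def dedupe_urls(urls):
    seen, clean = set(), []
    for url in urls:
        key = url.strip().lower().rstrip("/")
        if key and key not in seen:
            seen.add(key)
            clean.append(url.strip())
    return clean

merged = dedupe_urls([
    "http://example.com/page-a",    # from Google
    "http://example.com/page-a/",   # same page from Bing
    "http://example.com/page-b",    # from Yahoo
])
print(merged)
# → two unique URLs
```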
Check Server Status
From Scrapebox, install the Alive Check addon and run it against the list of indexed URLs. You can tell it which status code to treat as alive, such as 200 success, or reverse check for 404 pages. You can also choose whether to follow a 301 permanent redirect or count it as dead. This will produce a result for each URL marked Alive, Dead or Failed. In the case of Failed it is useful to recheck a few times, in case your site's server is rejecting your requests as a precaution against too many requests.
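The decision the addon makes per URL boils down to a status-code classification. A sketch of that logic, assuming the same three buckets (this is my approximation of the addon's behaviour, not its actual code):

```python
# Sketch: classify an HTTP status code the way the Alive Check addon
# reports results, with 301s either followed or treated as dead.
def classify_status(status, follow_redirects=True):
    if status == 200:
        return "Alive"
    if status in (301, 302):
        return "Alive" if follow_redirects else "Dead"
    if status == 404:
        return "Dead"
    return "Failed"  # timeouts, 5xx, resets: worth rechecking a few times

print(classify_status(301, follow_redirects=False))
# → Dead
```

The "Failed" branch is why rechecking helps: a 503 from a rate-limiting server is transient, while a 404 is not.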
Check Backlinks using Scrapebox
Another useful addon lets you check whether your links are still up on a particular webpage. The Link Checker plugin works by checking a batch of webpages or blogs for mentions of your site. You can also use it to check whether the links are nofollow or otherwise. My experience with the addon is that you need to run the server status error list a few times, as it times out before finding an actual link. However, the link not found report was accurate.
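At its core the check is: does the fetched page contain an anchor pointing at your domain, and does that anchor carry rel="nofollow"? A self-contained sketch of that parsing step (my own illustration of what the addon does per page, using the standard-library HTML parser):

```python
# Sketch: scan a page's HTML for a link to our domain and report whether
# it is followed, nofollow, or missing.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.found = False
        self.nofollow = False

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if self.domain in (attrs.get("href") or ""):
            self.found = True
            self.nofollow = "nofollow" in (attrs.get("rel") or "")

def check_backlink(html, domain):
    finder = LinkFinder(domain)
    finder.feed(html)
    if not finder.found:
        return "Link Not Found"
    return "Nofollow" if finder.nofollow else "Followed"

page = '<p><a href="http://mysite.com/page" rel="nofollow">my site</a></p>'
print(check_backlink(page, "mysite.com"))
# → Nofollow
```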
Below is the instruction video for both addons:
A recent review of car finance websites resulted in the following backlinks SWOT analysis.
We took the top keywords relating to car finance, car loans and car credit and ran a rankings report for Google and Bing. Car Loan 4U and Money Supermarket are tied in first-page rankings with 61 keywords each in Google. However, Car Loan 4U totally tanks in Bing, with no rankings for its main keywords.
Backlink history shows Car Loan 4U has had a higher velocity of backlinks than its main competitors, but with a slowdown in the last quarter of 2008 at the same domain diversity. This is probably a result of fewer sitewide links in navigation and footers. Creditplus has a lower backlink velocity but greater diversification in where its links come from, which has always been an aspect of a good link profile.
Anchor Text Diversity
Domain anchor text distribution shows a good spread of main industry keywords and some brand-name links for most sites. This is important given the recent Penguin update targeting over-optimised backlinks, so brand terms need to make up at least a 75-80% share of anchors. However, the high proportion of money keywords relative to brand terms at Car Loans 4U appears unnatural.
The keyword “Car finance” is the dominating search term in the industry and a forecast shows it will continue to be a dominant keyword in 2013.
Search Engine Visibility Opportunity
Pursue white hat link building through:
The objective is to review various link building tactics for your website and look at their pros and cons, with the aim of increasing the number of links from high quality sites and blogs.
As a link building service we would review each tactic in relation to the client website, try implementing a few of the different tactics, and share successes/failures on a portfolio.
Tactic: Link Building through Social Media
Link building through social media involves reviewing new and existing Twitter followers who are industry leaders and engaging them, with the aim of eventually asking them to link to us. The process is discussed in the SEOmoz Whiteboard Friday on Scalable Link Building Using Social Media.
This can be a long-term relationship which we can use to promote particular content, or simply a way to be in a journalist's or blogger's mind next time they think of link resources for a new article. The process can be complex and time consuming, but is hopefully streamlined by BuzzStream's social component.
Pros: Such relationships can be more long-term and far-reaching than cold email requests. We have the advantage of using BuzzStream's social tracking ability.
Cons: It is very time consuming building and maintaining such relationships.
Tactic: Links from Journalists and top industry bloggers
We will search for related content on big news media sites such as the BBC, New York Times or CNN, as well as industry niche news sites. This technique involves outreach to bloggers or journalists who are writing content related to our projects or feature articles.
Process is as follows:
- Search bloggers/journalists writing about our feature/project subject
- Ask them to include our link or write a review of our feature
- Entice with social mentions on our network
The proposal will be in the form of:
- A link to our page with similar content will be useful to your readers
- Appraise our content and link to it
Pros: These sites have the potential to bring hordes of traffic as well as increased rankings.
Cons: It is difficult and time-consuming to find contact details for journalists.
Tactic: Competitor links import
This involves putting the competitor website into Open Site Explorer to find the links pointing to their site, downloading the links as a CSV file and importing it into BuzzStream. Once in BuzzStream we sort the prospects and keep the high-value ones for outreach.
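The sorting step can be done before import with a short script. A sketch under stated assumptions: the column names below are illustrative, and a real Open Site Explorer export uses its own headers and metrics, so adjust accordingly:

```python
# Sketch: filter an exported backlink CSV down to high-value prospects
# before importing into outreach software. Column names are illustrative.
import csv
import io

def high_value_prospects(csv_text, min_authority=40):
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["URL"] for r in rows
            if int(r["Domain Authority"]) >= min_authority]

export = """URL,Domain Authority
http://strong-blog.example/post,62
http://weak-directory.example/listing,12
"""
print(high_value_prospects(export))
# → ['http://strong-blog.example/post']
```

The threshold of 40 is an arbitrary example; pick whatever cutoff matches the niche.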
Pros: It is easier than searching for new links and they are more likely to link to us if they have linked to similar sites.
Cons: The competitor might find out you are following their links and do likewise with ours.
Tactic: Dead link building
The process involves finding links that point to dead pages on our sites. We need to download the dead links, then contact webmasters, pointing out the error and providing an alternative link target. It helps both sides, because webmasters do not like linking to dead pages.
Pros: The website has already linked to us in the past so likely to link again.
Cons: It is a time consuming and tedious process requiring technical ability and follow ups.
Tactic: Competitor dead link building
This involves running the Chrome plugin on links/resources pages to discover any dead links. We can then use a template to contact the webmaster with the suggestion of an alternative page on our site. Looking for dead links should happen as a matter of course when reviewing links/resources pages.
Pros: It presents the webmaster with a solution so it might build long term friendship.
Cons: It takes extra effort to note sites while scouring the industry websites.
Tactic: Approach sites already linking to us
Sites that have linked to us in the past are important, as they are likely to link again, so they should be approached for further link building opportunities. They should, however, be evaluated in terms of the link value and traffic they have generated. We will form relationships with high quality sites that bring in traffic so that we can propose new content to link to.
Pros: Webmaster is more likely to link if we point out they have done so in the past.
Cons: We need link diversity from new sites, so this technique cannot be relied on too heavily.
Tactic: competitions and giveaways
We can run a link building contest among bloggers, with prizes like a free advert or social media promotion. This can involve asking bloggers to write particular content about our site and rewarding the best article. A recent good example can be found at http://kaplaninternational.com/blog/how-to-learn-english/.
Pros: A prize or give-away is an easy way of getting attention of bloggers.
Cons: You may have to wade through a lot of poor content to pick a winner, and some webmasters might remove our link once they find out they have not won.
Tactic: Infographics
The process involves producing infographics from the images and content in projects, feature articles and white papers.
Pros: You can watermark the image so that visitors know where it has come from.
Cons: The process might require external resources to produce the infographics, and bloggers might not always link to the page you want them to.
Tactic: Phone calls
We need to phone and talk to some of the people we have made email contact with. The process involves making phone calls to link prospects already contacted, starting with those with whom we already have an email dialogue, then social media contacts, then pure cold calls.
Pros: Hearing a voice attached to a name might help reassure people that you are not a spammer.
Cons: Some people do not take kindly to cold/warm sales-type calls.
Tactic: Create movie via slide from images
The method involves:
- Using Animoto or Windows Movie Maker to turn images on our site into movie files.
- Upload to YouTube and other video sharing sites
- Moderate comments
- Promote channel via blogs and other social media
Pros: Having a YouTube channel creates another avenue of traffic, as videos appear in the organic SERPs.
Cons: As with any social media presence, you have to moderate comments to keep out abuse and spammers.
Tactic: Create a widget for placing on relevant sites/blogs
Create a widget with a feed of our content and offer it to relevant sites/blogs as a news/image source. An example can be seen at http://www.reddit.com/r/linkbuilding/widget/.
Pros: Create a useful widget and it can be used several times across the web.
Cons: It is time consuming to create a useful widget and it may involve the web design team. The images are not in a central location to be pulled by a feed.
Tactic: Use client email list to request links
The idea is to use our lists of client data to approach clients for a link back to a relevant page on our site. You need to discuss with the relevant authority, such as sales managers, how best to go about this without upsetting client relationships.
Pros: Our clients are very likely to link to us as we have a common interest.
Cons: It may be viewed as reciprocal link building which has been devalued by search engines.
Tactic: Encourage newsletter subscribers to link to us
This will take the form of encouraging people to link by asking: “if you’ve enjoyed this newsletter, you can link to the permanent version at (insert URL)”.
Pros: Our newsletter subscribers may be more willing to link to us. It can increase social media likes and shares.
Tactic: Adding link prospects while browsing
Link prospects appear all the time while browsing within a particular niche, so it is useful to be mindful of them even while browsing or searching for similar content. Finding a good value site and noting it can easily be done via the BuzzMarker plugin, which imports it into prospects.
Pros: This helps to steadily build prospects for future outreach.
Tactic: Getting links from links/resource pages
The process involves searching for links/resources pages by keyword and asking the website owner to add our link. We have been using this technique for a few weeks and have gained some links, but a lot of webmasters ignore the deep link request and link to the homepage instead.
Pros: It is easier to propose a new link where there are similar links.
Cons: Links from these pages may be devalued by search engines in the future. Some website owners request a reciprocal link, which we are not able to give, and most webmasters link to the homepage instead of deep linking.
Tactic: Get citation links from Wikipedia
Target Wikipedia as part of our link building as follows:
1) Create an account on Wikipedia. Resource: http://en.wikipedia.org/wiki/Wikipedia:Tutorial.
2) Search Wikipedia for a highly searched issue in feature articles.
3) Scan through the feature and Wikipedia to look for a place on Wikipedia where you can add a citation back to our feature article.
Pros: Wikipedia ranks well for several research type queries related to NRI sites so there is potential to gain extra traffic.
Cons: The links in Wikipedia are nofollow, so they will contribute little to rankings. The amount of click-through is low because citations are at the bottom of the page.
Tactic: Start blogs to host images and videos
The idea is to start industry blogs to which we can send images and videos, in the hope that they will start ranking and pass traffic.
Pro: It is easy to setup a blog and upload images and videos.
Con: This was a good idea in the past but is doubtful now because of Google clamping down on blog networks.
Tactic: Directory submissions
Submitting your profile to internet directories used to be an assured way of getting a quick backlink. However, Google has devalued a lot of general directories due to their link farm nature. There are a few niche directories worth submitting to, depending on the industry, which we shall explore.
Pros: They are a quick way of getting backlinks.
Cons: Google is on the warpath against directories, so very few worth submitting to will remain.
Google has finally released a disavow tool for cleaning up bad links. It was widely expected after Google hinted that they were working on a tool similar to Bing’s disavow tool. The tool has also been much needed following Google’s Penguin update, which aimed at penalising sites with spam backlinks. This led to some website owners bombing competitors’ sites with bad links – an act known as negative SEO. It further led to link hostage situations, with some site owners extorting money from webmasters in order to remove links pointing to their sites, and ensuing legal threats.
The tool, however, comes with several warnings, as expressed in the release video. First and foremost, it is not for every mom-and-pop set-up but is aimed at advanced users, much like rel=canonical.
Secondly, SEOs should not jump in and liberally disavow links they think could be below par. The main starting point should be an unnatural links warning from Google, in which Google may even point out a few examples of the kinds of links they find unnatural. You may also have some links resulting from previous spammy SEO link building activity that you are not proud of and now wish to disassociate from.
Also, if some links are clearly not ones you would want pointing to your site, you should go ahead and disavow them. These can be malicious links, such as ones from porn and gambling sites, possibly the result of negative SEO.
Google also suggests the tool should be used only after trying to get backlinks taken down manually. A takedown is better, since other search engines and individuals will otherwise still see the bad links pointing to your site.
One thing to note is that Google will not automatically and immediately act on the disavow request. Firstly, they reserve the right to ignore the request, in the same way they may ignore a rel=canonical pointing to a 404, for instance. The request will also take a while to work as Google recrawls the sites, though you can point out your disavow request in a reconsideration request. Lastly, if you later change your mind, removing the disavow will take a lot longer and may not restore 100% of the link value.
Check the disavow tool here
Guest blogging for links has become common practice amongst link builders lately, due to doors shutting on other traditional forms of link building. However, many have wondered whether Google would view the practice as an illicit form of link acquisition, so it was interesting to get a word from Matt Cutts on the subject.
Google’s position appears to be that if you are a high quality blogger and want to write posts in another community’s space to bring knowledge and insight, that should be welcomed as a positive contribution. It both informs and brings insight to a community from a blogger who is possibly less well known.
What Google doesn’t want is people offering the same post multiple times, or spinning the same content and offering it elsewhere. They also discourage the practice of posting low quality content, such as posts outsourced to freelance writers who are clearly not experts in the field.
Writing short guest posts is another sign of a low quality contribution. This happens when the guest blogger aims to write the bare minimum of 300 words just to get the post published on another site, purely for links.
It is good that Google has cleared the air on this now common way of link building. However, it should not be viewed as carte blanche, or as the only officially acceptable way of acquiring links. Sure, guest posts contribute to the growth of online content, as quality blogs invite writers to fill space and inform their readers, but it should be used as part of an overall link building strategy.
Below are some of the steps I would take in creating a link-building programme.
Keyword Research
Keyword research is the foundation of any link building campaign, so I would decide the important keywords for the homepage and high value product category pages. In general, the keyword strategy for categories will consist of choosing high traffic key phrases that describe the products and services of each category.
Rankings & Visibility Report
Report on rankings, competitors and visibility at the start so that regular measurements can be used as KPIs. I like to use Advanced Web Ranking or similar software to check the rankings and visibility of important keywords over time compared to competitors.
Establish KPIs, Timeframes and Reporting Cycle
I would look to agree KPI items like:
- Increase in number of quality links from highly relevant sites
- Increased visibility of major keywords compared to main competitors
- Increase in number of organic visitors
- Increased organic conversion rates as evidenced in Analytics ecommerce report
Perform a Link Audit
I would perform an audit of the links pointing to mysite.com using a program like Open Site Explorer from SEOmoz. This would flag dead or spam links.
Dead link recovery
This is one of the easy, low-hanging-fruit activities that can be used to acquire links from websites linking to deleted or moved pages. The process involves finding the links that point to dead pages, as per the audit, downloading them and contacting the webmasters, pointing out the error and providing an alternative link target. It helps both sides, because webmasters do not like linking to dead pages.
Low quality links Removal
The process involves making a list of low quality, spam links, running the list through a program to acquire contact details, and writing to webmasters with a removal request.
Competitor links Acquisition
This involves putting a competitor site into Open Site Explorer to find the links pointing to their site. We then sort the links by value and delete ones we would not like a link from. We can then import the CSV file into outreach software (e.g. BuzzStream) to get contact details and manage correspondence.
Daily link building strategy
For the homepage and important product categories I will develop a strategy to acquire links from industry bloggers, journalists and supplier websites. I will work with the PR team to ensure content and distribution are optimised.