Google Webmaster Guidelines – Back to Basics

Webmaster guidelines are the basic rules Google provides to make your website easier to crawl. In an internet full of experts, it is always worth reviewing the guidelines every now and then.

Google's webmaster guidelines are split into three sections:

1) Technical guidelines

2) Quality guidelines

3) Design and content guidelines

We take a brief look at each:

Technical Guidelines

You should know that Googlebot primarily reads written text, so JavaScript, frames, Flash and session IDs will restrict the amount of content it can index.

You should make use of a robots.txt file to tell Googlebot about any content you don't want crawled.
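As a minimal sketch, a robots.txt file placed at the site root might look like the following; the /private/ directory and the sitemap URL are hypothetical examples, not part of the original text:

```
# Applies to all crawlers
User-agent: *
# Keep this hypothetical directory out of crawling
Disallow: /private/
# Everything else remains crawlable
Allow: /

# Optionally point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is a crawling directive, not an access control: blocked URLs can still appear in results if they are linked elsewhere.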

If you use a content management system (CMS), make sure it produces descriptive "pretty" links rather than URLs with long session IDs or parameters that do not describe the page.
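If your CMS only produces parameterised URLs, many servers let you map them to descriptive paths. A hedged sketch for Apache's mod_rewrite, assuming a hypothetical product.php script, might look like:

```
# Hypothetical .htaccess sketch: serve /products/42 from product.php?id=42
RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L,QSA]
```

Visitors and crawlers then see /products/42 instead of a query string that says nothing about the page.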

Make sure that nothing is preventing your pages from loading fast, as load times are now a (small) ranking factor.

Quality Guidelines

These guidelines are aimed at preventing attempts to trick search engines. For instance, you should avoid serving different content to users and to search engines. Google also provides an outlet if you believe another site is abusing these guidelines and flying under its radar: you can report it through the dedicated Google webmaster tools spam report page. It all comes back to Google's main ethos of giving users a great experience by providing unique content that adds value to surfers and the internet in general.

A good rough guide is to ask yourself whether you would publish the same content if search engines did not exist.

They also ask that you avoid automated scraping programs aimed at constantly checking your rankings, though this is difficult to avoid altogether in the work of an SEO consultant.

Design & Content Guidelines

These are very important guidelines as they relate to the structure of the site and its effect on crawlability. For instance, if a page is not reachable from any other page, it is tantamount to sending the Google spider down a dead-end road. This can easily be fixed by making sure the sidebar navigation somehow connects to every page on the site.

As far as content is concerned, Google asks that it be informative and that you avoid "churnalism", repeating the same content found in many other places.

You should do keyword research and make sure you use the words that users will actually type into a search engine to find your content.

You should check for broken links via webmaster tools and other sources, and fix them.
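A first step in any broken-link check is collecting the links on a page. Here is a minimal sketch using only Python's standard library; the class name LinkCollector and the sample markup are our own, and a real checker would then request each URL (e.g. with urllib) and flag non-200 responses:

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href values from <a> tags so they can be checked later."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return every anchor href found in the given HTML string."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


page = '<a href="/about">About</a> <a href="https://example.com">External</a>'
print(extract_links(page))  # ['/about', 'https://example.com']
```

Feeding each collected URL to an HTTP client and recording the status codes would complete the audit.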

You should also ensure that your page titles and meta tags accurately describe the content of the page.
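As an illustrative sketch (the product and company names are invented), a descriptive head section might look like this:

```html
<!-- A unique, descriptive title and meta description for each page -->
<head>
  <title>Blue Widget Pricing &amp; Specifications | Example Co</title>
  <meta name="description"
        content="Compare prices and specifications for Example Co's range of blue widgets.">
</head>
```

Each page should get its own title and description rather than sharing one site-wide boilerplate.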

Remember that each outgoing link on a page dilutes the value it passes on, so try to keep the total number of links on a page to a reasonable number.
