Google Penguin is now running in Real Time

On Friday, September 23rd, 2016, Google announced the newest update to the Penguin algorithm. This is big news for site builders and managers who have been affected in the past. Just as Google did with Panda when it became part of the core algorithm, Penguin now works in real time.

The Penguin filter was first released in April of 2012 as a standalone program. The goal of Penguin was to catch sites deemed to be using spammy link building practices and penalize them. This algorithm helped to discourage many low-quality link building techniques that were popular at the time.

This update also means that Penguin joins Google’s core algorithm, where it will sit alongside more than 200 other signals. Penguin 4.0, as it’s called, is the sixth major update Google has confirmed. Google also stated they will not announce or confirm future updates for either Panda or Penguin.

There are two big changes that come with this update.

Penguin now runs in Real Time

Since its release, Penguin updated on a very sporadic schedule. Because it ran outside of Google’s core algorithm, it had to be manually refreshed. This meant the list of sites impacted by Penguin could go months without being updated, so penalized sites could wait a very long time for link fixes to count in Google’s ranking algorithm. Now, Penguin will update on the fly.

Now that Penguin is running with the rest of the core algorithm, its data should update when pages are recrawled or reindexed. Webmasters can also request a recrawl through tools such as Search Console. It is possible penalties will now disappear in hours, not months. On the flip side, gaining a few spammy links could quickly earn you a penalty.

Penguin is more granular

Google was not exactly clear about this part of the update. The general belief, though, is that pages will now be evaluated on an individual basis, a change from earlier versions, which looked only at whole domains.

This will be interesting to watch as data becomes available. Sites where only a few pages use spammy link tactics could see an overall boost in traffic, while sites that were penalized by the original algorithm could now accumulate even more penalties.

What does this Update Mean?

Many, many site owners felt a direct impact from Penguin 3.0, and they can now see instant improvements in traffic; by the time you read this, hopefully you already have. Analyzing the effect of Penguin will be more difficult now, though, and site owners will need to go through their sites page by page.

If you know that your site has a Penguin penalty looming, now is the time to fix it. There are a handful of tools online that grade links by spam score. Once you have identified your spammy links, submit a disavow file to tell Google to ignore anything with a high spam score. Within days you should see a big increase in organic traffic.

Overall, this update is not a surprise to anyone. Google will continue to take steps to ensure the results ranking on the first page are of the highest quality. As long as site owners produce high-quality content and acquire links in an authentic way, the Penguin penalty will not be a concern.

Best practices for bloggers reviewing free products

As a way to market themselves online, some companies offer blogs (ones that partly or fully lend themselves to product reviews) free products in exchange for favourable online reviews and occasionally testimonials. The arrangement is widely perceived as win-win: the blogger benefits from the free product, while the favourable review increases the company’s online recognition and exposure.

Free product review blogs came to the fore in the US when the Federal Trade Commission published guidelines requiring that bloggers disclose the full nature of their relationship with the companies behind the products they advertise.

The success of free product reviewing as a means of advertising has increased the need to regulate this kind of blog.

On March 11, Google published a notice on their webmaster blog highlighting best-practice guidelines for bloggers who receive free products from manufacturers, to help them avoid being penalized by the search engine.

Failing to meet the guidelines could even cause some of the blogger’s articles not to show in Google search. The three practices are listed below.

Use nofollow tags on the link

The nofollow tag was first introduced in the early 2000s as an instruction to search engines not to follow a particular link. When manufacturers offer free products to blogs in exchange for reviews and testimonials, they typically request that the blogger link back to their company home page, their social media pages or affiliate pages.

Google required that bloggers use nofollow tags on such links because they were not organic: according to Google, since the company had to provide a free product or service, or pay, for the links to exist, they were artificially generated. The blogger, however, is not required to add nofollow HTML attributes to everything on their blog, just to the affected links.
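
For illustration only, here is a minimal sketch of one way a blogger might retrofit rel="nofollow" onto just the sponsor’s links. The sponsor domain and sample HTML are hypothetical, and it uses the BeautifulSoup library rather than anything Google prescribes.

```python
# A minimal sketch: add rel="nofollow" only to links pointing at the sponsor.
# The sponsor domain and the sample HTML below are made up for illustration.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

SPONSOR_DOMAINS = {"example-sponsor.com"}  # hypothetical sponsoring brand

html = """
<p>Thanks to <a href="https://example-sponsor.com/">Example Sponsor</a>
for the review unit. See also my <a href="/earlier-review">earlier review</a>.</p>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a", href=True):
    host = urlparse(link["href"]).netloc.lower()
    # Only the sponsored links get the nofollow attribute;
    # internal and editorial links are left untouched.
    if any(host == d or host.endswith("." + d) for d in SPONSOR_DOMAINS):
        link["rel"] = ["nofollow"]

print(soup)
```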

Google also suggested that since the penalties for contravening this guideline would be far-reaching enough to affect even the sponsoring brand’s online visibility, the companies needed to play their part by reminding bloggers to use the nofollow tags appropriately.

Full disclosure of the relationship

The guidelines also required that users be informed when they are viewing sponsored content. This is already a legal requirement in some countries, which require bloggers to disclose when content is sponsored. Beyond following Google’s guidelines, the purpose of the disclosure is also to inspire trust in readers.

Google also recommended that the disclosure of the relationship between the blogger and the manufacturer be placed at the top of the post, in case users don’t read the whole article.

Create Original content

If you are a blogger aiming to build a successful niche blog, it is important to offer visitors a compelling reason to come back. This can be done either by providing useful, exclusive, well-researched content that no other blog offers, or by covering a niche so unique that you become the go-to source of information.

In typical Google fashion, the search engine sent out penalties to product review blogs over the weekend of April 8th to April 10th, just a few weeks after putting up the notice. The manual actions went to bloggers who failed to heed the guidelines posted on the webmaster blog.

About Google’s real-time, A.I. updates

It’s that time again! Toward the end of last year, Google released news confirming major changes to be implemented in their algorithm, including a ‘real-time’ version of the spam-fighting Penguin as well as a developing A.I. called ‘RankBrain’ to team up with Hummingbird.

Google’s algorithm is a multi-faceted and ever-evolving entity with infinite sophistication. If these descriptors resemble how you might characterize an omnipresent deity, then I would say you are considering the developing changes to Google’s search algorithm with an appropriate level of weight.

In recent years, Google has become much more transparent about impending changes to the algorithm and has even provided names for the inner workings responsible for specific tasks:

– PageRank and Hummingbird for the processes associated with distributing credit and authority to a page,
– Panda and Penguin for the developing processes aimed at battling spam and penalizing dishonest SEO practices.

What does a ‘real-time’ version of Penguin mean for me?

Presently, Penguin refers to the aspects of Google’s crawl aimed at locating and determining the validity of all links pointing at your page. If Penguin determines that your backlinks are spammy and manipulated, your site will see a major decrease in ranking as sites with quality linking structures are pushed ahead of you.

More recent updates to Penguin have had a fatal impact on private blog networks (PBNs), for instance. In general, Penguin has effectively changed the way the sheer number of backlinks to a site is valued and has forced webmasters to view linking structures as honest handshakes with relevant networks rather than an arbitrary numbers game.

Now, imagine these basic principles of Penguin being implemented in real-time. You can expect to see an immediate change in how your site is ranked upon gaining a link or removing one. The great benefit this will provide is the ability to identify and recover from a penalty immediately upon implementing a change to your linking structure. Of course, this will also create a world without mercy where penalties can accumulate faster than ever before.

Why does Hummingbird need A.I. on its team?

Every step forward in the aspects of Google associated with determining quality content and page relevance has been a natural necessity. It made sense that pages containing the exact phrasing of any given query were ranking until, of course, people started ‘stuffing’ exact-match keywords to force those rankings. It then made sense to limit the acceptable number of instances of a keyword on a page, until people started inserting keywords amidst nonsensical, unrelated content.

In recent years, Hummingbird has developed a tremendous ability to determine page relevance in an effort to provide quality responses to user queries. But ultimately Hummingbird is still dependent on instances of a keyword on a page. Yes, it is able to determine that a page containing instances of a particular keyword phrase is relevant to the user. It is also capable of ‘stemming’ variations of a keyword so that not-so-exact matches populate the results while still containing some form of the keyword phrase.
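
Stemming itself is a standard text-processing idea rather than something Google has documented in detail; purely as a rough illustration, here is a minimal sketch using NLTK’s Porter stemmer (an assumption, not Google’s own method) to show how different variations of a keyword collapse to one stem.

```python
# A rough illustration of stemming as a general concept (requires the nltk
# package); it is not a representation of how Hummingbird actually works.
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
variations = ["link", "links", "linked", "linking"]

# All four variations reduce to the same stem, so a query containing one
# form can be matched against pages containing another.
for word in variations:
    print(f"{word:10s} -> {stemmer.stem(word)}")
```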

But is it always necessary to have the keyword phrase present at all? Isn’t it true that large authority sites hosting content from industry-leading providers are not ranking simply because the content was not prepared with ‘keyword density’ in mind? Should this content really be penalized?

For many queries out there, keyword density will always be a valuable metric for the sake of the user. But there are a great number of queries where the value placed on keyword instances on a page has resulted in users not getting the best possible sources of information in their search results.

This is, reportedly, the issue Google intends to resolve with RankBrain, a form of artificial intelligence.

RankBrain: A Machine-Learning Artificial Intelligence System

Google has not yet released much information about the exact nature of RankBrain, but its intention is clear: learn about pages in order to provide search results that have been tailored with more thought and less systematic response.

Machine learning refers to a machine’s ability to teach itself about the data it is exposed to. RankBrain will go beyond its initial programming and determine a page’s value by developing an understanding of what is on the page, rather than just recognizing patterns of text and systematically determining relevance.

The goal is to provide users with results that may or may not have exact-match keywords or even variations of the inputted keywords. These results will, consequently, provide information of higher value from deserving sources that users would not have otherwise seen.

Should I Fear?

As always, the impending changes associated with these names will result in devastation for the rankings of many sites. This does not mean that you should come to fear these names and the changes they represent. For those out there providing quality content on sites that have been structured with honest SEO practices, these updates should only improve your rankings and success by pushing you ahead of those sites that never deserved to be on the radar.

The key is to be on the lookout for upcoming changes so that you can begin to update your SEO checklist in plenty of time to ensure maximum benefit once the change rolls out.

Manual spam action revoked

Further to our Webmaster Tools unnatural outbound links manual action reconsideration request last week, we have received another email from the webspam team to say the manual spam action is revoked. This was not for one but for three sites, and none had ever accepted payment to publish content. One was heavily reliant on content from a guest blogging network that has reportedly been the subject of a Google clampdown on such networks, and the other two had only run one or two such articles in the past. However, the sites were, and still are, listed in the blogging network’s directory as seeking content from guest writers.

On a positive note, I have been approached by guest bloggers to publish content this week, and they have agreed to do so despite the external links on the site now being nofollow. It is a shame the blogging network was subject to the action, as it was a great meeting point for writers and publishers, and I have personally enjoyed the relationships built.


Anchor Text Variation Pre & Post Penguin

A lot of link builders have been struggling with the correct ratio of anchor text after the Penguin update. Below is a suggestion from Matt Cullen, whom many SEOs have a lot of respect for (a small sketch for checking your own ratios follows the quote):

“Pre-Panda/Penguin, there was a pretty specific formula that would
work the majority of the time for ranking a website in Google.
Obviously, this was dependent on the competition too, but the key
strategies went something like this…

1. Get a bunch of links – a lot of them

2. Get links with your keyword in them

The formula looked like this:

– 60% of your links should be your main keyword

– 15% of links are your main keyword + other words (eg: read about,
check out, etc)

– 15% of your links generic text (eg: click here, visit site)

– 10% of your links as the text (eg: your link URL used as the text)

And like a lot of people have experienced, when Google’s
Penguin update came around, it affected this “formula”.

BUT there’s one very good reason that some of our sites
didn’t get hit at all, or recovered extremely fast.  And it has
absolutely NOTHING to do with where we were getting the links from,
but more so with “how” we were  linking… i.e. the link
density and anchor text we were using.

We quickly realized that the old way of doing things needed to be
changed.  Slamming your sites with a bunch of keyword rich anchor
text is finished – it won’t work.

But the reason we’ve now rebounded is because of the following
two things:

1. It wasn’t the links that were the problem

Getting a ton of links still works.  Do I need to repeat that?

Most people froze and just stopped building links altogether
fearing that if they continued, their sites would get hit even
harder.

They were wrong.  You see I figured that if you suddenly stopped
business as usual, that it would be a sure-fire sign to Google that
you were up to something and then they would hit you even harder.

They were looking for a reaction on your part.  If something
suddenly changes, you’re likely gaming the system.  If your
links are in fact natural, well then why would anything change?

It wouldn’t…

So we got that part right and now we’re almost back to
pre-Penguin traffic levels and rankings.

But there was a second thing…

2. Anchor text links have CHANGED…

Our formula for the amount of keyword anchor text we used was now
defunct.  We didn’t change it right away (keeping with what I
outlined above), but we did slowly phase in a new formula.

The new formula goes a little something like this (obviously
adjusting as needed, based on our competition too):

20% – your main keyword that you’re wanting to rank for

25% – related LSI keywords (these are keywords related to your main
keyword.  Think of these as the “related” searches that
Google shows you as you’re typing in their search field).

35% – your URL as the anchor text, as well as other combinations of
“branded keywords”.

20% – generic keywords

Google is now looking for a more diverse linking profile – a more
natural one.”
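
As a rough way to sanity-check a backlink profile against the post-Penguin percentages quoted above, here is a minimal sketch of my own; the keyword lists and sample anchors are hypothetical, and how you bucket real anchors is a judgement call rather than a fixed rule.

```python
# A minimal sketch that tallies an anchor-text profile against the quoted
# post-Penguin targets (20% exact, 25% related/LSI, 35% brand/URL, 20% generic).
# The keyword lists and sample anchors below are hypothetical.
from collections import Counter

EXACT = {"blue widgets"}                              # main keyword
RELATED = {"cheap blue widgets", "widget reviews"}    # LSI / related phrases
BRANDED = {"examplewidgets.com", "example widgets"}   # URL / brand anchors
GENERIC = {"click here", "visit site", "this post"}

TARGETS = {"exact": 0.20, "related": 0.25, "branded": 0.35, "generic": 0.20}

def classify(anchor: str) -> str:
    a = anchor.strip().lower()
    if a in EXACT:
        return "exact"
    if a in RELATED:
        return "related"
    if a in BRANDED:
        return "branded"
    if a in GENERIC:
        return "generic"
    return "other"

anchors = ["blue widgets", "examplewidgets.com", "click here",
           "widget reviews", "example widgets", "blue widgets"]

counts = Counter(classify(a) for a in anchors)
total = len(anchors)
for bucket, target in TARGETS.items():
    share = counts[bucket] / total
    print(f"{bucket:8s} actual {share:6.1%}  target {target:6.1%}")
```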

Going from Google Penalty to Reconsideration

A website owner responded to our advert in a panic, leaving several phone messages and email contact requests. He had woken to the dreaded email from Google informing him his site had been penalised due to unnatural linking behaviour. If you have been doing some aggressive link building in the past, you may get a similar email telling you your site has been de-listed for suspicious behaviour. The question is what steps one should take to get out of Google jail.

Below are the steps I took to go from the warning letter to a reconsideration request.

Download Backlinks

Find out what links could be causing the problem. The difficulty here is that Google has been notoriously secretive about telling us which links it counts for our site. A Google search for link:yourwebsite.com shows very few links, so we have to rely on other sources like Open Site Explorer and MajesticSEO to find our backlinks. Matt Cutts, in his disavow tool launch video, suggests that the warning letter comes with sample links found to be unnatural, but there was no such hint in our client’s email. You do get more backlink information from Webmaster Tools, though you have to dig a little: go to the Traffic tab, then click the “More” tab under “Who links the most”. This gives you a list of all sites linking to you, but you want the actual links, so click the “Download more sample links” button, which gives you a CSV of the actual URLs where your backlinks appear.
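
Once you have that CSV, grouping the URLs by linking domain makes the audit much quicker. Here is a minimal sketch; the file name and single-column layout are assumptions, so adjust the column handling to match your actual export.

```python
# A minimal sketch: group a downloaded sample-links CSV by linking domain so
# the noisiest domains can be reviewed first. File name and layout are assumed.
import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("sample_links.csv", newline="", encoding="utf-8") as fh:
    for row in csv.reader(fh):
        if not row or not row[0].startswith("http"):
            continue  # skip headers and blank rows
        domains[urlparse(row[0]).netloc.lower()] += 1

# Print the 20 domains with the most links pointing at your site.
for domain, count in domains.most_common(20):
    print(f"{count:5d}  {domain}")
```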

Do a Backlinks Audit

The difficult part in sorting out the backlinks list is deciding which links are harming your site. There are many places on the internet detailing what are considered poor links, including:

  • Bad neighbourhoods: links from gambling and porn/adult-themed sites
  • Anchor text diversity: following the Penguin update, anchor text proportion is on every SEO’s mind. MajesticSEO has updated their tools to show a pie chart of anchor text, which you can use to ensure there is not too much focus on one or two money terms instead of a brand term. Common thinking today is that your brand or website URL should comprise 75-80% of anchor text, with the rest spread sparingly among money terms.
  • An increase in links from one particular site: again, MajesticSEO is your friend in looking at backlink diversity. Webmaster Tools can also be useful, as it shows the total number of links from each site, so you might want to start there. An unnaturally high number of backlinks from one site can occur when your link is placed in a sidebar or footer, which flags it to Google as a likely paid link.

Link removal steps

Once you have your list of bad links, it is time to take steps to remove or discount them. Start by writing to the webmaster or site owner with a friendly email asking them to remove your link from their website. Most website owners should not have an issue with removing the links – especially if it means they reduce the number of links on the page and get to keep the content of the article.

Some webmasters will not respond, or will try to hold you to ransom by asking for payment to remove a link. In this case Google has given us the new disavow tool to let them know you do not want to be associated with these links. To use it, save the bad links as a plain text file, with one URL or domain per line, then head over to the disavow tool, upload the file and hit submit.
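
To show the file format, here is a minimal sketch that turns a list of bad links into a disavow file; the URLs, domains and file names are hypothetical. Lines starting with # are comments, “domain:” lines disavow a whole domain, and remaining lines are individual URLs.

```python
# A minimal sketch: build a disavow.txt from a hypothetical list of bad links.
from urllib.parse import urlparse

bad_urls = [
    "http://spammy-directory.example/widgets",
    "http://link-farm.example/page1",
    "http://link-farm.example/page2",
]
disavow_whole_domain = {"link-farm.example"}  # sites judged bad site-wide

lines = ["# Disavow file prepared ahead of a reconsideration request"]
lines += sorted(f"domain:{d}" for d in disavow_whole_domain)
# Individual URLs only for domains not already disavowed wholesale.
lines += [u for u in bad_urls
          if urlparse(u).netloc not in disavow_whole_domain]

with open("disavow.txt", "w", encoding="utf-8") as fh:
    fh.write("\n".join(lines) + "\n")
```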

Filing Reconsideration Request

Once you have taken steps to get bad backlinks removed or disavowed, you can send a reconsideration request. You need to fill in the reconsideration form with as much detail as you can and submit it to the Google webspam team. The form is a mixture of confession box and a means of reporting bad SEOs: it asks you to name any SEO who may have created these bad links for you. I guess this is one way for the webspam team to collate information and act where one individual or company is constantly reported. You also need to provide details of all the efforts you have made to get the bad links taken down, and whether you have asked Google to dissociate you from the links via the disavow tool.

After you fill in the request form you will get a message confirming you are being reconsidered for re-indexing. They do say it can take several weeks, depending on the number of reconsideration requests they are working on, so don’t hold your breath!

 

Google Panda Update in UK

Yesterday Google rolled out its Panda update to the rest of the world after trialling the algorithm on the US search market. The update affects websites in mostly English-speaking locations, including the UK.

As discussed in February, Google released its Panda algorithm update in the US, where it affected the rankings of up to 12% of searches. The main negative effects were on sites commonly labelled content farms, which led some to call it the farmer update.
Although it initially hit large sites with many pages, it now includes smaller sites as well. Google confirmed that they think the algorithm change had a positive impact in the US, which comprises around 50% of its market, and therefore felt confident rolling the change out to the rest of the world as part of their quality drive to improve user experience.

Google’s proprietary algorithm uses various metrics to determine the relevancy of a site for search queries, including the HTML text on the page, the content’s subject, the anchor text of links from other sites and page load speed. However, big G has also introduced ways for searchers to block results they deem irrelevant to their search. Although still on trial, this may lead to searcher feedback becoming a ranking factor in the future, so if many people block a certain site it may drop in rankings.

Google has also reiterated that their webmaster help forum is open to anyone who feels aggrieved, as discussed before, while not promising any manual review.

A list of the sites immediately affected by the Panda UK update can be seen in the Searchmetrics blog here.

Google UK Panda Update

Limiting Google Panda Update Impact

The Google farmer update was meant to improve the quality of site content, moving from thin, shallow pages to quality material that is of more use to the visitors Google sends to your site. There is plenty of advice on how you can improve your site to minimise any negative impact from the Google Panda update, including the following:

•    Remove or improve poorly written content from all parts of the site

•    You can also update any shallow material to give it more depth.

•    You can block low-quality or auto-generated content through your robots.txt file, or move it to another site or sub-domain (a minimal robots.txt sketch follows this list).

•    Make sure your content is unique and adds value to what is already available online when adding content to any part of the site.

•    Try to get user-generated content such as product reviews and customer feedback for ecommerce sites. Providing further product details in the form of user reviews, videos and FAQs will also improve user experience, which is part of Google’s stated remit.

•    Link out to other relevant sites to show you have a genuine desire to educate your readers. For an ecommerce site this could mean the manufacturer’s website, where the user can go to find more information.

•    Ensure web pages with a lot of graphics and video have an adequate balance of HTML content.
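
As promised above, here is a minimal robots.txt sketch for the blocking suggestion; the directory names are hypothetical, and you should verify the rules in Webmaster Tools before relying on them.

```python
# A minimal sketch: write a robots.txt that blocks crawlers from sections of
# thin or auto-generated pages. The directory names below are hypothetical.
rules = """\
User-agent: *
Disallow: /auto-generated/
Disallow: /tag-archives/
"""

with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(rules)
```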

More details for ecommerce sites are available from a further UK blog post and Webmaster Radio.

Panda Update

Panda Update: Claiming Collateral Damage

Whereas it is easy to see how Google can conclude a site’s content has been copied from elsewhere in its Panda update, it is hard to see how it can judge whether content is useful when the algorithm only sees words and cannot be expected to make sense of sentences.

However, as well as hitting the intended internet spammers, the massive update also hit a lot of innocent bystanders who have quality websites.

So what can you do if you feel you were innocently caught in the spam-fight crossfire? Well, searchengineland confirm that it is not worth appealing to public opinion, because Google has categorically stated this is an algorithmic change and not a manual review. There is no whitelist or blacklist of favoured sites, as in the case of Blekko, so there is no need to feel personally aggrieved.

Google admits mistakes can be made with any such change, since no algorithm can produce 100% perfect results. However, Google is keen to learn of any casualties in its quest to perfect the algorithm, so it has put a thread in their forum for those who feel they were unfairly affected: “Think you’re affected by the recent algorithm change? Post here.”

Google’s Panda update Winners and Losers

The aim of the Google Panda update was, first, to block sites that duplicate content or scrape from other sites and, second, to downgrade sites that publish content without much value.

Sistrix crunched through a million search queries, compared the results to those before the update, and came up with the following list of top losers.

Sistrix panda list

A bigger list of affected sites is available in the following Google spreadsheet.

One of the major surprises for me was ezinearticles.com, which is a favourite publishing platform for many SEO activists and has very strict human-edited guidelines.

In his blog, ezinearticles.com founder Chris Knight promises a major cleanup and some drastic changes, including removing low-quality articles, increasing the rejection rate in some categories, increasing the minimum word count to 400 and possibly making all links nofollow in future.

Unique Article Wizard and its sister company Uber Articles claim they are among the gainers, with a reported large increase in traffic.

uber articles increase panda update

Ehow.com was not only left unscathed but also experienced a traffic increase, much to the surprise of many who claimed Google had effectively killed eHow’s competitors.