About Google’s real-time, A.I. updates

It’s that time again! Toward the end of last year, Google released news confirming major changes to its algorithm, including a ‘real-time’ version of the spam-fighting Penguin as well as a developing A.I. called ‘RankBrain’ to team with Hummingbird.

Google’s algorithm is a multi-faceted and ever-evolving entity of enormous sophistication. If these descriptors resemble how you might characterize an omnipresent deity, then you are considering the developing changes to Google’s search algorithm with an appropriate level of weight.

In recent years, Google has become much more transparent about impending changes to the algorithm and has even provided names for the inner workings responsible for specific tasks:

– PageRank and Hummingbird for the processes that distribute credit and authority to a page,
– Panda and Penguin for the developing processes aimed at battling spam and penalizing dishonest SEO practices.

What does a ‘real-time’ version of Penguin mean for me?

Presently, Penguin refers to the aspects of Google’s crawl aimed at locating and determining the validity of all links pointing at your page. If Penguin determines that your backlinks are spammy or manipulated, your site will see a major decrease in ranking as sites with quality linking structures are pushed ahead of you.

More recent updates to Penguin have had a fatal impact on private blog networks (PBNs), for instance. In general, Penguin has effectively changed the way the sheer number of backlinks to a site is valued and has forced webmasters to view linking structures as honest handshakes with relevant networks rather than an arbitrary numbers game.

Now, imagine these basic principles of Penguin being implemented in real-time. You can expect to see an immediate change in how your site is ranked upon gaining a link or removing one. The great benefit this will provide is the ability to identify and recover from a penalty immediately upon implementing a change to your linking structure. Of course, this will also create a world without mercy where penalties can accumulate faster than ever before.
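To make the “real-time” idea concrete, here is a toy sketch of a link profile whose score is recomputed the moment a link is added or removed. Everything here is invented for illustration: the spam labels, the scoring formula, and the class itself have no relationship to Google’s actual (non-public) signals.

```python
# Toy model of "real-time" link evaluation. The is_spammy labels and the
# quality-ratio score are illustrative assumptions only; Google's actual
# link signals are not public.

class LinkProfile:
    def __init__(self):
        self.links = {}  # url -> is_spammy (bool)

    def add_link(self, url, is_spammy):
        self.links[url] = is_spammy
        return self.score()  # score updates immediately on change

    def remove_link(self, url):
        self.links.pop(url, None)
        return self.score()

    def score(self):
        """Fraction of quality (non-spammy) links; 1.0 = clean profile."""
        if not self.links:
            return 1.0
        clean = sum(1 for spam in self.links.values() if not spam)
        return clean / len(self.links)

profile = LinkProfile()
profile.add_link("https://relevant-blog.example/post", is_spammy=False)
penalized = profile.add_link("https://pbn-site.example/page", is_spammy=True)
recovered = profile.remove_link("https://pbn-site.example/page")
print(penalized, recovered)  # 0.5 1.0
```

The point of the sketch is the immediacy: the penalty appears the moment the bad link is gained and disappears the moment it is removed, rather than waiting for a periodic refresh.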

Why does Hummingbird need A.I. on its team?

Every step forward in the parts of Google’s algorithm that determine content quality and page relevance has been a natural necessity. It made sense to rank pages containing the exact phrasing of any given query until, of course, people started ‘stuffing’ exact-match keywords to force those rankings. It then made sense to govern the number of acceptable instances of a keyword on a page, until people started inserting keywords amid nonsensical, unrelated content.

In recent years, Hummingbird has developed tremendous capabilities for determining page relevance in an effort to provide quality responses to user queries. But ultimately, Hummingbird is still dependent on instances of a keyword on a page. Yes, it is able to determine that a page containing instances of a particular keyword phrase is relevant to the user. It is also capable of ‘stemming’ variations of a keyword so that not-so-exact matches populate the results while still containing some form of the keyword phrase.
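Stemming can be illustrated with a crude suffix-stripping sketch. This is a deliberately simplified stand-in: real search engines use far more sophisticated morphological analysis (the Porter stemmer is the classic published example), and the suffix list and length check below are invented for this demonstration.

```python
# Toy suffix-stripping "stemmer" for illustration only. Real stemmers
# (e.g., the Porter algorithm) handle many more cases than this.

SUFFIXES = ("ing", "ers", "er", "ed", "s")

def stem(word):
    """Strip the first matching suffix, keeping at least a 3-letter root."""
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def matches(query_term, page_words):
    """Return page words sharing a stem with the query term."""
    target = stem(query_term.lower())
    return [w for w in page_words if stem(w.lower()) == target]

page = ["Plumbers", "plumbing", "plumber", "pipes"]
print(matches("plumber", page))  # ['Plumbers', 'plumbing', 'plumber']
```

A query for “plumber” can thus match “plumbers” and “plumbing,” even though neither is an exact match, because all three reduce to the same root.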

But is it always necessary to have the keyword phrase present at all? Isn’t it true that large authority sites hosting content from industry-leading providers are not ranking simply because the content was not prepared with ‘keyword density’ in mind? Should this content really be penalized?

For many queries out there, keyword density will always be a valuable metric for the sake of the user. But there are a great number of queries where the value placed on keyword instances on a page has kept the best possible sources of information out of users’ search results.
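For readers unfamiliar with the metric, keyword density is commonly defined as keyword occurrences divided by total word count. A minimal sketch of that calculation, under that assumed definition (Google publishes no official formula or threshold):

```python
# Keyword density under one common definition: occurrences of the keyword
# divided by total words. The definition is an assumption for illustration;
# no official formula or "safe" threshold is published by Google.

import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

text = "Cheap flights here. Book cheap flights today for cheap deals."
print(round(keyword_density(text, "cheap"), 2))  # 0.3
```

A density of 0.3 (30%) on a page like the example above is exactly the kind of stuffing the earlier updates were built to catch; the opposite failure, a genuinely authoritative page scoring near zero, is the problem described here.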

This is, reportedly, the issue Google intends to resolve with RankBrain, a form of ‘Artificial Intelligence.’

RankBrain: A Machine-Learning Artificial Intelligence System

Google has not yet released much information about the exact nature of RankBrain, but its intention is clear: learn about pages in order to provide search results tailored with more thought and less systematic response.

Machine learning refers to a machine’s ability to teach itself about the data it is exposed to. RankBrain will go beyond its programming and determine a page’s value by developing an understanding of what is on the page rather than just recognizing patterns of text and systematically determining relevance.

The goal is to provide users with results that may or may not contain exact-match keywords, or even variations of the inputted keywords. These results will, consequently, surface higher-value information from deserving sources that users would not otherwise have seen.
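The contrast between exact-match retrieval and matching on meaning can be sketched with a hand-built synonym table. To be clear, this is a crude stand-in: RankBrain’s internals are not public, and a real system would learn these relationships from data rather than from a hard-coded dictionary.

```python
# Contrasting exact-match retrieval with a crude "meaning-aware" match.
# The SYNONYMS table is a hand-built illustrative assumption; real systems
# learn such relationships from data, and RankBrain's internals are not public.

SYNONYMS = {
    "attorney": {"attorney", "lawyer", "counsel"},
    "cheap": {"cheap", "affordable", "budget"},
}

def exact_match(query_term, page_words):
    """Old-style check: the literal keyword must appear on the page."""
    return query_term in page_words

def semantic_match(query_term, page_words):
    """Looser check: any term related to the keyword counts as a match."""
    related = SYNONYMS.get(query_term, {query_term})
    return any(w in related for w in page_words)

page = ["experienced", "lawyer", "for", "affordable", "rates"]
print(exact_match("attorney", page))     # False: no literal keyword
print(semantic_match("attorney", page))  # True: a related term is present
```

A page about an “experienced lawyer” never mentions “attorney,” so an exact-match system skips it entirely, while a system that understands the relationship between the two terms can still surface it.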

Should I Fear?

As always, the impending changes associated with these names will devastate the rankings of many sites. That does not mean you should come to fear these names and the changes they represent. For those out there providing quality content on sites structured with honest SEO practices, these updates should only improve your rankings and success by pushing you ahead of sites that never deserved to be on the radar.

The key is to be on the lookout for upcoming changes so that you can begin to update your SEO checklist in plenty of time to ensure maximum benefit once the change rolls out.