Google Penguin is now running in Real Time

On Friday, September 23rd, 2016, Google announced the newest update to the Penguin algorithm. This is big news for site builders and managers who have been affected in the past. Just as Google did with Panda when it became part of the core algorithm, Penguin now works in real time.

The Penguin filter was first released in April of 2012 as a standalone program. The goal of Penguin was to catch sites deemed to be using spammy link building practices and penalize them. This algorithm helped to discourage many low-quality link building techniques that were popular at the time.

This update also means that Penguin joins Google’s core algorithm, taking its place among the more than 200 other signals in the main algorithm. Penguin 4.0, as it’s called, is the sixth major update Google has confirmed. Google also stated it will not announce or confirm future updates for either Panda or Penguin.

There are two big changes that come with this update.

Penguin now runs in Real Time

Since its release, Penguin updated on a very sporadic schedule. Because it ran outside of Google’s core algorithm, it had to be refreshed manually, so the list of sites impacted by Penguin could go months without being updated. Because of this, penalized sites could wait a very long time for link fixes to count in Google’s ranking algorithm. Now, Penguin will update on the fly.

Now that Penguin runs with the rest of the core algorithm, its data should update whenever pages are recrawled or reindexed, and webmasters can use tools such as Fetch as Google in Search Console to request a crawl. It is possible penalties will now disappear in hours, not months. On the flip side, gaining a few spammy links could quickly get you penalized.

Penguin is more granular

Google was not exactly clear about this part of the update. The general belief, though, is that pages will now be evaluated on an individual basis, a change from earlier versions, which looked only at whole domains.

This will be interesting to watch as data becomes available. Sites with only a few pages utilizing spammy link tactics could see an overall boost in traffic. On the other hand, sites that received penalties from the original algorithm could receive even more penalties now.

What does this Update Mean?

Many, many site owners felt a direct impact from Penguin 3.0, and with this update they could see instant improvements in traffic; by the time you read this, hopefully, you already have. Analyzing the effect of Penguin will be more difficult now, though, as site owners will need to go through their sites page by page.

If you know that your site has a Penguin penalty looming, now is the time to fix it. There are a handful of tools online that grade links by spam score. Once you’ve identified your spammy links, submit a disavow file through Google’s disavow tool covering anything with a high spam score. You could then see organic traffic recover within days rather than months.
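
For reference, the disavow file Google accepts is a plain text file with one rule per line: a full page URL disavows a single link, while a domain: prefix disavows every link from that site, and lines beginning with # are comments. A minimal sketch (the domains are placeholders):

```
# Link network flagged by a spam-score tool
domain:spammy-link-network.example.com
# A single paid link we could not get removed
http://blog.example.net/paid-links-page.html
```

The file is uploaded through the disavow tool in Search Console and takes effect as the links are recrawled.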

Overall, this update is not a surprise to anyone. Google will continue to take steps to ensure that the results ranking on the first page are of the highest quality. As long as site owners produce high-quality content and acquire links in an authentic way, the Penguin penalty will not be a concern.

Keyword Research: 5 Common Mistakes to Avoid

Keyword research is very important to SEO professionals, as it is what enables them to generate traffic to websites and ultimately earn money from them.
By researching your niche’s keywords, you not only get exposed to your consumers’ needs but also come to understand your potential customers and build solid relationships with them. It is important to emphasize, however, that you don’t want just any kind of visitors; you want the right kind of visitors.
Keyword research is often deemed to be pretty easy, but this is not always the case, and as a result tons of mistakes are made. So, if you’re just venturing into SEO writing or have never really understood the keyword research process, here’s what you need to avoid:

1. Don’t Ignore Long Tail Keywords

Long tail keywords are three- or four-word phrases that describe your product vividly and very specifically to your consumers. Most SEO writers mistakenly chase high-volume keywords. It seems to make sense: where there is high search volume, there are more potential clients, right? But competing for a keyword with 5,000 searches a day is a battle, and these popular search terms make up only thirty percent of the web’s total searches. The other seventy percent is made up of long tail keywords, and that’s what you ought to target. Use longer, very specific phrases with low search volume instead; once a consumer types one into a search engine, they are more likely to find you. Long tail keywords also convert better, as they catch people later in the buying/conversion cycle.

Source: Moz

2. Ignoring current searches

Most SEO writers tend to overlook current keyword search rankings. It is extremely important to know where a page currently ranks, because that tells you which type of content is being ranked for the term each day. So before trying to rank your webpage, check that it fits in with the content already ranking in the search results, then evaluate those results for competitiveness before attempting to rank for a keyword phrase.
3. Ignoring Click-through Rates

Most SEOs target high-volume keywords. However, low-volume keywords with high click-through rates are more likely to attract new customers. Take a keyword like ‘perfume’: a search for it will yield millions of results. Now consider the keyword ‘Japanese Cherry Blossom Perfume’. A person making this search is very specific about what he or she wants, is likely to find it on your site if you’ve used such a keyword, and is more likely to be ready to purchase from you. It is extremely important to test click-through rates before investing any of your resources in a keyword.


4. Ignoring the AdWords value of a keyword

AdWords value is most applicable to commercial keywords. It helps give an indication of how valuable and competitive certain keyword phrases are. Advertisers in most cases invest their money in certain keyword phrases on the simple premise that those keywords acquire customers, and the higher the value of the keywords, the more they are willing to pay. Take some time to find out which keywords are valuable to advertisers, and what their value is, before trying to rank your content for any keyword phrase.


5. Mistaking informational searches for commercial intent

When a person makes a specific search, he or she could be looking either to buy or to gather information about that search term. Both are totally acceptable. So consider whether you’re going to use your website to provide information or for commercial purposes, and decide upon the right keywords accordingly. Do a proper search for the keyword phrase you are trying to rank for to see what kind of results show up. If you are trying to rank an informational article and only commercial results show up, or vice versa, you may want to re-evaluate your options.

Best practices for bloggers reviewing free products

As a way to market themselves online, some companies offer blogs (ones that partly or fully lend themselves to product reviews) free products in exchange for favourable online reviews and, occasionally, testimonials. The situation is widely perceived as win-win: the blogger benefits from the free product, while the favourable review increases the company’s online recognition and exposure.

Free product review blogs came to the fore in the US when the Federal Trade Commission published guidelines requiring that bloggers disclose the full nature of their relationship with the companies behind the products they advertise.

The success of free product reviewing as a means of advertising has increased the need to regulate this kind of blog.

On March 11, Google published a notice on its webmaster blog highlighting ‘best practices’ for bloggers who receive free products from manufacturers, so that they can avoid being penalized by the search engine.

Failing to meet the guidelines could even cause some of a blogger’s articles not to show in Google search. The three practices are listed below.

Use nofollow tags on the link

The nofollow tag was first introduced in the early 2000s as an instruction to search engines not to follow a particular link. When manufacturers offer free products to blogs in exchange for reviews and testimonials, they typically request that the blogger link back to their company home page, their social media pages or affiliate pages.

Google required that bloggers use nofollow tags on such links because they were not organic: according to Google, since the company had to provide a free product or service, or pay, for the links to exist, they were artificially generated. The blogger, however, is not required to add nofollow HTML attributes to everything on the blog; just the affected links.
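
In practice, the change is a single attribute on the anchor tag. A minimal example (the URLs are placeholders):

```html
<!-- A link the manufacturer asked for: rel="nofollow" tells search engines to pass no authority -->
<a href="https://example-manufacturer.com/widget" rel="nofollow">Acme Widget</a>

<!-- Ordinary editorial links elsewhere on the blog need no change -->
<a href="https://another-blog.example.org/related-post">a related post</a>
```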

Google also suggested that, since the penalties for contravening this guideline would be far-reaching enough to affect even the sponsoring brand’s online visibility, the companies needed to play their part by reminding bloggers to use the nofollow tags appropriately.

Full disclosure of the relationship

The guidelines also required that users be informed when they are viewing sponsored content. This is already a legal requirement in some countries, which oblige bloggers to disclose when content is sponsored. Beyond following Google’s guidelines, the purpose of the disclosure is to inspire readers’ trust.

Google also recommended that the disclosure of the relationship between the blogger and the manufacturer be placed at the top of the post, in case users don’t finish reading the whole article.

Create Original content

As a content blogger aiming to create a successful niche blog, it is important that you offer visitors a compelling reason to come back. This can be done either by providing useful, exclusive, well-researched content that no other blog has to offer, or by covering a unique niche for which you are the go-to source of information.

In typical Google fashion, the search engine sent out penalties to product review blogs during the weekend running from April 8th to April 10th, just a few weeks after putting up the notice. The manual actions went to bloggers who failed to take heed of the guidelines posted on the webmaster blog.

About Google’s real-time, A.I. updates

It’s that time again! Toward the end of last year, Google released news confirming major changes to be implemented in its algorithm, including a ‘real-time’ version of the spam-fighting Penguin as well as a developing A.I. called ‘RankBrain’ to team up with Hummingbird.

Google’s algorithm is a multi-faceted and ever-evolving entity with infinite sophistication. If these descriptors resemble how you might characterize an omnipresent deity, then I would say you are considering the developing changes to Google’s search algorithm with an appropriate level of weight.

In recent years, Google has become much more transparent about impending changes to the algorithm and has even provided names for the inner workings responsible for specific tasks:

– PageRank and Hummingbird for the processes associated with distributing credit and authority to a page,
– Panda and Penguin for the developing processes aimed at battling spam and penalizing dishonest SEO practices.

What does a ‘real-time’ version of Penguin mean for me?

Presently, Penguin refers to the aspects of Google’s crawl that are aimed at locating and determining the validity of all links pointing at your page. If Penguin determines that your backlinks are spammy or manipulated, your site will see a major decrease in ranking as sites with quality linking structures are pushed ahead of you.

More recent updates to Penguin have had a fatal impact on private blog networks (PBNs), for instance. In general, Penguin has effectively changed the way the sheer number of backlinks to a site is valued and has forced webmasters to view linking structures as honest handshakes with relevant networks rather than an arbitrary numbers game.

Now, imagine these basic principles of Penguin being implemented in real-time. You can expect to see an immediate change in how your site is ranked upon gaining a link or removing one. The great benefit this will provide is the ability to identify and recover from a penalty immediately upon implementing a change to your linking structure. Of course, this will also create a world without mercy where penalties can accumulate faster than ever before.

Why does Hummingbird need A.I. on its team?

Every step forward in the aspects of Google associated with determining quality content and page relevance has been a natural necessity. It made sense that pages containing the exact phrasing of any given query were ranking until, of course, people started ‘stuffing’ exact-match keywords to force those rankings. It then made sense to govern the number of acceptable instances of a keyword on a page, until people started inserting keywords amidst nonsensical, unrelated content.

In recent years, Hummingbird has developed a tremendous ability to determine page relevance in an effort to provide quality responses to user queries. But ultimately, Hummingbird is still dependent on instances of a keyword on a page. Yes, it is able to determine that a page containing instances of a particular keyword phrase is absolutely relevant to the user. It is also capable of ‘stemming’ variations of a keyword so that not-so-exact matches populate the results while still containing some form of the keyword phrase.

But is it always necessary to have the keyword phrase present at all? Isn’t it true that large authority sites hosting content from industry-leading providers fail to rank simply because the content has not been prepared with ‘keyword density’ in mind? Should this content really be penalized?

For many queries out there, keyword-density will always be a valuable metric for the sake of the user. But there are a great number of queries where the amount of value placed on keyword instances on a page has resulted in users not getting the best possible sources of information displayed in their search results.

This is, reportedly, the issue Google intends to resolve with RankBrain, a form of ‘Artificial Intelligence’.

RankBrain: A Machine-Learning Artificial Intelligence System

Google has not yet released much information about the exact nature of RankBrain. But its intention is clear: learn about pages in order to provide search results that have been tailored with more thought and less systematic response.

Machine learning refers to a machine’s ability to teach itself about the data it is exposed to. RankBrain will go beyond its programming and determine a page’s value by developing an understanding of what is on the page, rather than just recognizing patterns of text and systematically determining relevance.

The goal is to provide users with results that may or may not have exact-match keywords or even variations of the inputted keywords. These results will, consequently, provide information of higher value from deserving sources that users would not have otherwise seen.
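
Google has said little about RankBrain’s internals, but the widely reported idea is that words and queries are mapped to numeric vectors so that related phrasings land close together. The toy sketch below uses tiny hand-made vectors and invented scores purely to illustrate that idea; it is nothing like Google’s actual model:

```typescript
// Toy illustration only: hand-made three-number "vectors" stand in for
// the learned word embeddings RankBrain is reported to use.
const vectors: Record<string, number[]> = {
  cheap:   [0.9, 0.1, 0.0],
  budget:  [0.8, 0.2, 0.1],
  hotel:   [0.1, 0.9, 0.2],
  lodging: [0.2, 0.8, 0.3],
};

// Represent a phrase as the sum of its word vectors.
function phraseVector(words: string[]): number[] {
  return words
    .map((w) => vectors[w])
    .reduce((sum, v) => sum.map((x, i) => x + v[i]));
}

// Cosine similarity: close to 1.0 means the phrases point the same way.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const mag = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (mag(a) * mag(b));
}

// "budget lodging" shares no words with "cheap hotel", yet scores as
// highly similar -- the kind of match exact-keyword systems miss.
console.log(
  cosine(phraseVector(["cheap", "hotel"]), phraseVector(["budget", "lodging"])).toFixed(3),
);
```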

Should I Fear?

As always, the impending changes associated with these names will result in devastation for the rankings of many sites. This does not mean that you should come to fear these names and the changes they represent. For those out there providing quality content on sites that have been structured with honest SEO practices, these updates should only improve your rankings and success by pushing you ahead of those sites that never deserved to be on the radar.

The key is to be on the lookout for upcoming changes so that you can begin to update your SEO checklist in plenty of time to ensure maximum benefit once the change rolls out.

AdWords management app for Android

Google finally launched an AdWords management app for Android today, the first time it has released an app for monitoring AdWords. The app, available from the Google Play store for devices running Android 4.0 or later, monitors vital metrics via a dashboard: impressions, clicks, CTR, cost and, if set up, conversions. You can explore metrics further by ad group, day of the week and device. You can also do some PPC management, such as updating bids and budgets, and the app offers campaign suggestions which you can activate directly. An iOS version is in development.

Mobile algorithm update

Google made an unprecedented announcement that it is splitting its desktop and mobile ranking algorithms from the 21st of April. The announcement is unusual in that it puts an exact date on an algorithm update.

This is the highlighted excerpt from Google Webmaster Central:

(Screenshot: Google Webmaster Central’s blog post announcing the mobile-friendly search update.)

As a follow-up, Google Webmaster Trends Analyst John Mueller answered questions about the update in the weekly Webmaster Central hangout.


Until now, SEO experts used the same practices for both desktop and mobile optimisation – a SERP was a SERP regardless of platform. Google has long urged webmasters to make their websites mobile friendly in view of the large and growing shift of traffic to mobile devices. The announcement says that from the 21st of April, non-mobile-friendly sites can expect a drop in rankings.

Any Effect on Desktop?

Whether or not this mobile algorithm update will have any effect on desktop rankings remains unclear, though many people think there is a link now or will be in the near future. Also unclear is whether the mobile-friendly update will apply site-wide or on a page-by-page basis. The distinction matters to websites that have adopted the Google-supported solutions of dynamic serving or a separate mobile site, since it determines whether such a site would be hit site-wide or on just a few pages.

Responsive web design is the way to go

What this means is that we cannot emphasise enough the importance of responsive design solutions. Responsive sites sidestep the problem because they have an exact one-to-one relationship between desktop and mobile pages. Responsive design has been Google’s stated preferred solution, so we can assume responsive websites will see an immediate boost compared to other approaches. Google has also gone a long way in supporting webmasters to make their sites more mobile-friendly, through schemes like the Mobile Usability report in Google Webmaster Tools, and has released a mobile-friendliness testing tool to assist further.
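
For anyone wondering what that one-to-one relationship looks like in practice, responsive design means one set of URLs and one lot of HTML whose layout adapts through CSS media queries. A minimal sketch (the breakpoint and class name are arbitrary choices):

```css
/* Mobile-first: the same page and URL serve every device */
.content {
  width: 100%;
  padding: 0 1em;
}

/* Wider screens get a fixed, centred column; no separate mobile site needed */
@media (min-width: 768px) {
  .content {
    width: 750px;
    margin: 0 auto;
  }
}
```

Paired with a viewport meta tag in the page head, a stylesheet like this goes a long way toward what Google’s mobile-friendliness testing tool checks for.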

Chrome Developer Tools Emulation Mode

Testing our websites has become a complex exercise for webmasters. Back in the day, testing was complete after verifying a site worked in a couple of browsers. New websites must now be tested on a variety of platforms, including mobile, tablet and desktop devices, as well as across operating systems, screen resolutions and capabilities such as touch screens and HD displays.

In some cases testing can take as much time as the development itself. Building code on just a PC will not let you appreciate the various other devices on which consumers might view your work. The challenge is knowing how to test these systems without having to switch between physical devices.


Chrome Emulation mode

Happily, the latest version of Chrome, v32 (google.com/chrome), has come to our aid with its new emulation mode in Developer Tools.

This helps you identify how your masterpiece will render on various devices without leaving your PC. You just need to start the browser, navigate to the website you are testing, open up Developer Tools and choose your emulation settings from a plethora of devices, platforms and resolutions.


Devices

The devices section has all the most popular devices, such as iPhones, iPads, Nexus tablets and Samsung Galaxy devices.

Screens

The screen setting lets you emulate displays such as the Apple Retina display, with twice the usual viewport resolution, showing how your work will be viewed on each device if you wish to go that deep.

Sensors

This gives you device sensor emulation, such as touch, geolocation and even the accelerometer.


User Agent

The user agent string can be spoofed, enabling client- and server-side code to act as appropriate. This saves you having to install a user-agent switcher Chrome add-on.
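
To see why spoofing the string matters, consider any code that branches on the user agent: Chrome’s emulation swaps the string for the chosen device’s, so both branches can be exercised from one machine. A rough sketch (the regex is a simplistic stand-in for real device detection):

```typescript
// Simplistic user-agent sniff; real detection libraries are far more
// thorough. This only shows the kind of branch emulation lets you test.
function isMobile(userAgent: string): boolean {
  return /Mobile|Android|iPhone|iPad/i.test(userAgent);
}

// Under emulation, Chrome reports the selected device's string here.
const ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1";
console.log(isMobile(ua) ? "serve mobile layout" : "serve desktop layout");
```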


Chrome Developer Tools’ emulation mode is not perfect, so give it a go but perhaps test on real-life devices as a double check.

Taxi quote & booking system Plugin

We built a taxi quote and booking WordPress plugin after searching online unsuccessfully for an off-the-shelf solution.

Plugin front-end

The plugin has the following features for the potential customer:

  • Obtain a fare quote by entering the address or postcode of the pickup and drop-off points (a rough sketch of the quoting logic follows this list).
  • Book a single or return journey.
  • Enter full addresses for the start and end of the journey.
  • Select the type of car from saloon, estate, executive, MPV and 8-seater.
  • Pay online by PayPal or choose to pay the driver on the day.
  • The system sends booking details by email to both customer and administrator, and SMS text alerts to the admin.
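
As a rough illustration of how such a quote might be computed, here is a hypothetical sketch; the base fare, per-mile rate and car-type multipliers are invented numbers, not the plugin’s actual code:

```typescript
// Hypothetical fare model: base pickup charge plus a per-mile rate,
// scaled by a multiplier for the chosen car type.
const carMultiplier: Record<string, number> = {
  saloon: 1.0,
  estate: 1.15,
  executive: 1.5,
  mpv: 1.3,
  "8-seater": 1.6,
};

function quoteFare(miles: number, carType: string, returnJourney = false): number {
  const baseFare = 3.0; // assumed pickup charge
  const perMile = 1.8;  // assumed per-mile rate
  const oneWay = (baseFare + perMile * miles) * (carMultiplier[carType] ?? 1.0);
  return Math.round((returnJourney ? 2 * oneWay : oneWay) * 100) / 100;
}

console.log(quoteFare(12, "executive")); // 36.9 for a 12-mile executive journey
```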


Plugin back-end

The back-end admin section allows you to control the displayed map by longitude and latitude.

  • Change the size of the map and the zoom level
  • Choose journey rates for different car types
  • Choose currency
  • Configure the terms and conditions to be displayed
  • Configure SMS alerts through the Textmarketer service
  • Configure the PayPal email address

Plugin Payment and Download

Please visit the plugin sales page for more information and payment details.


Any questions? Email: ericmfilter-taxi[at]yahoo.com


How does Google rank a page without links?

A recent video in Google’s webmaster help series dealt with this question.

When Google encounters a page without links, it judges the page by the content and keywords it finds. The first instance of a keyword indicates the subject matter, and several repetitions confirm that the page is about that subject. However, there comes a point where the repetition is judged as keyword stuffing and the effect turns negative. If the page is about a very niche topic or phrase, it has the potential to rank better for that non-competitive keyword than it would for a highly searched phrase.
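
As a toy model of that behaviour (the scoring and threshold here are invented, not Google’s formula), you can picture each repetition adding diminishing confidence until a stuffing threshold flips the effect negative:

```typescript
// Toy model only: confidence grows logarithmically with repetitions,
// then becomes a penalty past an assumed stuffing threshold.
function topicConfidence(text: string, keyword: string): number {
  const occurrences =
    text.toLowerCase().split(keyword.toLowerCase()).length - 1;
  if (occurrences === 0) return 0;  // no signal for this term
  if (occurrences > 10) return -1;  // assumed threshold: reads as stuffing
  return Math.log(1 + occurrences); // each repeat adds less than the last
}

console.log(topicConfidence("lawn care tips for spring lawn care", "lawn care")); // ~1.1
```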

The cost of world cup defeats


The real cost of losing a football match early in the World Cup goes far beyond the scoreline. Defeat dampens the national mood, which makes people less willing to spend.

Businesses like pubs take a big hit as the crowds fail to turn up to cheer their team and consume drink and food. Many marketing departments spend months planning for a long spell of World Cup fever, and that work is put into serious jeopardy when the home team goes home early.

The BBC reports drops of up to 0.34% and 1% in the UK and Spanish stock markets following shock World Cup football defeats. More on this story at BBC News – World Cup 2014: The real cost of losing.