Google’s search ranking algorithm has undergone countless changes since its debut. In the past, nobody could predict all the possible methods to push low-quality sites to the top of search results, but Google dealt with them as they came – with the help of algorithm updates.

What Are Google Algorithms?

Google’s algorithms are a complex system used to retrieve data from its search index and instantly deliver the best possible results for a query. The search engine uses a combination of algorithms and numerous ranking signals to deliver webpages ranked by relevance on its search engine results pages (SERPs).

The best practice for avoiding flags under Google’s Webmaster Guidelines is to review the outlined rules and adjust your site wherever it falls short. Having ads or promoting third-party sites won’t get you in trouble as long as they are presented in a natural way and the parties you advertise are authoritative. In its early years, Google only made a handful of updates to its algorithms; now it makes thousands of changes every year.

Let’s get started with an update that remains one of the most important to this day: the Panda Update.

1. The Panda Update

The Google Panda Update was released in February 2011 and is still refreshed from time to time today, so obeying the rules laid out here is important for ongoing SEO success.

The Panda Update introduced a filter into Google’s process. By January 2016, the filter had become such a critical tool for Google that it was officially incorporated into the core algorithm. The Panda filter is in charge of siphoning websites with poor overall content out of the search results, or at the very least preventing them from ranking well. The major flags Panda checks for are duplicate content, thin content and keyword stuffing, meaning pages that cram their text with target keywords.

What does it do?

This algorithm update is the most likely to strike you.

Google Panda evaluates websites based on the quality of their content.

Pages with high-quality content are rewarded with higher ranking positions, and vice versa.

It boils down to how good you are with on-page optimization.

What triggers the Panda?

  • Thin content. This doesn’t necessarily mean content with too few words. Need a demonstration? Type “is it Christmas?” in Google’s search bar and see what’s ranking first. The site checks the date and then just says Yes or No in your local language. I won’t encourage you to be laconic like a Spartan, though. When you create content, make sure it provides an explicit answer to the user’s search query.
  • Low-quality content. This means content that hurts to even look at, let alone read. Poorly formatted text with grammar errors, huge or otherwise distracting images, design that negatively affects the user experience – anything you suspect will rub users the wrong way, will. Their visit to your site should be enjoyable.
  • Unhelpful, untrustworthy content. The kind that doesn’t help the users who found it or causes outright harm. Google has no tolerance for incompetence and con artistry. Strive to be a positive force.
  • Duplicate text. It’s often referred to as “duplicate content”, but Panda really only frowns upon copied chunks of text. Images are fair game. Videos are fair game (except on YouTube). Text is where you should be careful. It’s OK to reuse small bits of text as quotes – if you properly mark them as quotes in context. Reusing text and passing it off as your original work is a no-go. Do that on enough pages to hamper the quality of your site, and Panda will take action.
  • Article spinning. This refers to attempts to avoid issues with duplicate content by rewriting text from another site. Unfortunately for those who try it, good content also needs to be original, and spinning often lowers the content’s quality as well (especially if you automate the process with software).

How to recover?

Are you positive your site was hit by Panda? Then your course of action is to improve your content’s quality.

If it’s obvious to you which pages need more work, overhaul them: remove all that offends users and the algorithm and put up more of the things deserving approval.

To prevent your site from being penalized by Panda, keep to a few practices that ensure your content stays high-quality:

  1. Eliminate duplicate content from your site, whether that’s fully duplicated pages or similar content repeated across multiple pages (a rough self-check is sketched after this list).
  2. Update all pages on your site to have at least 800 words of content, and refresh this content periodically.
  3. Avoid any keyword stuffing or use of keywords that seems unnatural.
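For the first point, here is a minimal Python sketch that compares two pages using word shingles and Jaccard overlap. The file names and the 50% threshold are illustrative assumptions, not anything Google has published.

```python
# A rough duplicate-content self-check: compare two pages' text using
# word shingles and Jaccard similarity. Thresholds here are illustrative.
import re

def shingles(text: str, size: int = 5) -> set:
    """Break text into overlapping word n-grams ("shingles")."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a: str, text_b: str) -> float:
    """Jaccard similarity between the two pages' shingle sets (0.0-1.0)."""
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = open("page_a.txt").read()   # plain-text extracts of two pages
page_b = open("page_b.txt").read()
score = similarity(page_a, page_b)
print(f"Overlap: {score:.0%}")
if score > 0.5:                      # illustrative threshold
    print("These pages look near-duplicate; consolidate or rewrite one.")
```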

2. The Venice Update

The Venice Update went live in February 2012 as part of a larger batch of algorithm changes. Frequently overlooked, this update is an important guideline for local SEO tactics.

Venice is an update to Google’s algorithm that improved search results for local queries. Until this point, if users wished to receive location-based results, they needed to attach the appropriate modifier to their search (e.g., for restaurants in London, the search would need to include the word London). After the Venice update, Google began using a computer’s IP address as well as the user’s physical location to inform results for searches that had local intent but didn’t spell it out. For example, if you searched for ‘good Italian restaurant’, Google could understand that you meant not only a good Italian restaurant but one close to you; a search for ‘seo agency’ from the San Francisco area would likewise surface agencies near San Francisco.

Further updates, known as the Pigeon and Possum Updates, have been made to complement the Venice Update since 2012. These updates have further homed in on showing the results closest to you for your query, and other improvements helped remove new local spammers who had found loopholes in the Venice update.

All in all, if you have a page that lives off local traffic, it is important to ensure your pages reflect that intent. Optimizing your metadata and on-page content with local references is a good start toward higher rankings with a local audience.
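One modern way to make that local intent explicit to search engines is schema.org LocalBusiness structured data. Below is a minimal Python sketch that emits such markup as JSON-LD; every value is a placeholder, and JSON-LD postdates Venice itself, so treat this as a present-day complement to locally optimized copy rather than part of the 2012 update.

```python
# Emit schema.org LocalBusiness markup as JSON-LD.
# All values below are made-up placeholders.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example SEO Agency",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Market St",
        "addressLocality": "San Francisco",
        "addressRegion": "CA",
        "postalCode": "94105",
        "addressCountry": "US",
    },
    "telephone": "+1-555-555-0100",
    "url": "https://www.example.com/",
}

# Paste the printed block into the page's <head> inside a
# <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```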

3. The Penguin Update

The Google Penguin Update was first introduced in April 2012 and has since been updated a number of times. Unlike Panda, this update has run as a real-time part of Google’s algorithm since Penguin 4.0 in 2016.

Penguin is a piece of software that focuses on backlinks. It was an update long overdue from the early days of PageRank, and it has caused many black hat SEO specialists a major headache. The essence of this update is that any links Penguin deems spammy or manipulative pass zero additional value to a website. In earlier versions of Penguin, such links could also earn the whole website a penalty, but Google noticed that anti-competitive actors were pointing manipulative links at their competitors and changed the behavior: bad links are now simply devalued rather than held against the site.

What does it do?

This is the second algorithm update most likely to hit you. Penguin has a lot in common with Panda, but it evaluates websites for a different factor: their link profiles. Backlinks positively affect a site’s rankings if:

  • They are placed on pages contextually related to your linked pages
  • They are surrounded by content related to your linked pages
  • They point to you from trustworthy sources
  • They come from multiple different domains

Conversely, dubious links from shady sources will negatively impact your rankings. Penguin makes sure of that.

Important note: Google Penguin is not the same as Google’s manual actions for unnatural linking. Penguin is completely automatic and will release its grip on your site once unnatural backlinks are no longer a factor. To deal with a manual action, you’ll need to submit a reconsideration request in addition to purging those links.

What triggers the Penguin?

  • Buying links. It’s a violation of Google’s Webmaster Guidelines to acquire links that pass PageRank in exchange for money or products.
  • Lack of anchor text diversity. Text inside backlinks is another factor affecting the quality of your link profile. If this text is the same everywhere, it will look to Google like an attempt to manipulate your rankings.
  • Low quality of links. A backlink will set Penguin off if the content surrounding it is low-quality or contextually irrelevant to the linked page. You can’t always control who links to you, but you should do all you can to get rid of links that harm you.
  • Keyword stuffing. Surprise! You’d think this would be Panda’s territory, since keywords are on-page content. But Penguin also watches for an unnatural use of keywords. Have you ever encountered pages with long, near-meaningless sentences filled with dozens of search queries? That’s what keyword stuffing looks like at its worst.

How to recover?

If the problem is in the backlinks department, you should dig through the ones you have.

The easiest way to do this is to scan your site with WebCEO’s Backlink Quality Check tool.

Once you’ve found all the bad apples in your basket, take them down through any means available.

If you are able to remove them manually, do it. If you can talk to the person who manages the linking domain’s content, do it. For cases where neither option works out, there’s Google’s Disavow tool.
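For reference, a disavow file is just a plain UTF-8 text list: one URL or one `domain:` rule per line, with `#` marking comments. Here’s a minimal sketch that assembles one; the domains are made-up placeholders.

```python
# Build a disavow file for Google's Disavow tool. The format is plain
# UTF-8 text: one URL or "domain:" rule per line, "#" for comments.
# The domains and URL below are made-up placeholders.
bad_domains = ["spammy-links.example", "link-farm.example"]
bad_urls = ["https://blog.example.net/paid-post-about-us"]

lines = ["# Links we could not get removed manually"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow a whole domain
lines += bad_urls                               # or individual URLs

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
print("\n".join(lines))
```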

Then get the keywords on your site in order if you’ve messed up with them, too. Reduce their numbers until the text looks natural everywhere.

You can scan your site’s pages with WebCEO’s Landing Page SEO tool to check what percentage of their content is keywords.
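If you’d rather script a rough check yourself, here’s a minimal Python sketch that computes a keyword’s share of a page’s words. The 3% ceiling and file name are illustrative assumptions, not a documented Google threshold.

```python
# A crude keyword-density check for a page's plain-text extract.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of all words on the page that belong to the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return hits * len(kw) / len(words)

text = open("landing_page.txt").read()  # placeholder file name
density = keyword_density(text, "seo agency")
print(f"Keyword share of all words: {density:.1%}")
if density > 0.03:  # illustrative rule of thumb, not Google's number
    print("Looks stuffed; rewrite until the copy reads naturally.")
```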

To avoid being penalized by Penguin, the following measures should be taken:

  1. Quality over quantity: avoid low-authority domains when searching for new links. A good way to ensure the domains you source links from are trustworthy is to enter the URL on Majestic and look up the Citation Flow and Trust Flow scores. Both range from 0 to 100; a score in double digits is respectable, and the two scores should be relatively close to each other.
  2. Diversified anchor text: avoid making every anchor keyword-rich, or Google will flag the pattern as manipulation. Instead, focus on diversifying your anchor text to build quality backlinks (a quick diversity tally is sketched after this list).
  3. Avoid buying links or using tools to create backlinks: both tactics will land you in partnership with low-quality sites known for black hat tactics. A strong network of links is not built overnight; it requires a good amount of work, and Google knows that.
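As promised above, here’s one quick way to tally anchor text diversity from a backlink export. The `backlinks.csv` file name and its `anchor` column are assumptions about whatever backlink tool you export from.

```python
# Tally anchor texts from a backlink export to spot over-optimization.
import csv
from collections import Counter

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]

if not anchors:
    raise SystemExit("No rows found in the export.")

counts = Counter(anchors)
total = sum(counts.values())
print("Top anchors:")
for anchor, n in counts.most_common(10):
    print(f"  {n / total:6.1%}  {anchor!r}")
# If one exact-match phrase dominates (say, over a quarter of all links),
# the profile will look manipulated; mix in branded and natural anchors.
```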

4. The Hummingbird Update

In August 2013, Google shook up the game and changed the core of their algorithm. This major update came to be known as the Hummingbird Update.

This update, unlike Panda and Penguin, was less about identifying and penalizing black hat techniques and more about improving Google’s search results. The idea was to better understand users’ search intent and provide them with more relevant answers. This meant expanding results past those that just matched on-page keywords to include latent semantic indexing, co-occurring terms and synonyms. By employing an advanced language-processing algorithm, more low-quality content was cut out and results pages were filled with more relevant pages than ever.

What does it do?

Unlike Panda and Penguin, the purpose of Google Hummingbird wasn’t to change how websites are ranked – at least not as directly.

Hummingbird aimed to improve search itself: by interpreting the user intent behind a query, it made the algorithm return webpages that would be the most qualified for the task. The context around keywords became just as important as the keywords.

What lies in post-Hummingbird SEO?

Hummingbird started the era of semantic search as we know it.

How to meet its standards?

The key lies in understanding what exactly users want to find when searching online.

Most of the time it’s obvious, especially if the query is in the form of a question. Provide answers in your content and be generous with details, synonyms, and contextually related words.

It’s highly recommended to thoroughly research the subject before you write about it; that way, you will possess all the necessary vocabulary and the means to use it correctly.

Be careful: the point is to help your audience, not confuse them. You don’t want to come off as a pseudointellectual who tries too hard to fit in.

Where to find semantic search-friendly keywords and phrases?

Wikipedia is a great example of a site optimized for semantic search (and it was even before Hummingbird). Thanks to being rich with information, its articles almost always satisfy user intent behind one-word and “what is” queries – because that’s precisely what Wikipedia is for. The same is true for other search results that appear for such queries.

As mentioned above, the Hummingbird Update was less about catching spammy users and more about improving Google’s results pages organically. In order to be successful with respect to Hummingbird, ensure your content is natural and you are targeting key themes, not just individual keywords.
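If you want a starting point for surfacing those themes, the sketch below counts frequent terms across a few reference texts you’ve saved locally. It’s a toy frequency count standing in for proper co-occurrence analysis, and the file names are placeholders.

```python
# Mine candidate co-occurring terms for a topic from local reference
# texts (e.g. your own research notes). A toy frequency count, not
# latent semantic indexing proper.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
             "it", "for", "on", "with", "that", "this", "are", "as"}

counter = Counter()
for path in ["notes1.txt", "notes2.txt", "notes3.txt"]:  # placeholder files
    words = re.findall(r"[a-z']{3,}", open(path).read().lower())
    counter.update(w for w in words if w not in STOPWORDS)

# The most frequent non-stopwords hint at the vocabulary your content
# should cover to address the theme, not just one keyword.
for word, n in counter.most_common(20):
    print(f"{n:4d}  {word}")
```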

5. The Mobile Update (Mobilegeddon)

In April 2015, Google launched its Mobile Update, which is as straightforward as it sounds.

This update punished sites that lacked a mobile-friendly version of their website or had poor mobile usability. Thus, if a search was made on a mobile phone, results with a mobile site were given higher priority, and sites without one were pushed down the results pages.

The solution here is as straightforward as the issue: if you haven’t already, launch a mobile version of your website. Once launched, you can use Google’s Mobile-Friendly Test to see whether your site is mobile-friendly and spot any usability issues. In July 2018, Google released an additional update making page speed an official ranking factor. So in terms of best practices, usability and speed are both important considerations.

What does it do?

One fine day with a boom and a blur, an update rolled out and it caused a little stir. Despite the scary name it received from SEO pros, sites didn’t crash and burn.

All Google did was introduce a new mobile search ranking factor: the user experience quality when viewed on small screens.

Such an innovation was spurred by a significant increase in the number of searches conducted on mobile devices. Google had a hunch we were heading toward a mobile-first world – and was completely right. The need to adapt their search algorithm for devices other than PCs was justified.

This mobile-friendly update, a.k.a. Mobilegeddon, arrived in 2015, and ever since then, there had been talk of a new, separate index for mobile-friendly websites. It finally saw the light of day in 2018, and sites that prepared for it early were promptly added to the new index.

What lies in post-Mobilegeddon SEO?

A good start would be to check if your site is already mobile-friendly. You can find out by using WebCEO’s Mobile Optimization tool.
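If you’d also like a scriptable first pass, the sketch below just checks whether a page declares a responsive viewport meta tag. It catches only the most obvious omission (a real audit needs Google’s Mobile-Friendly Test) and assumes the third-party requests and beautifulsoup4 packages.

```python
# Crude first-pass mobile check: does the page declare a responsive
# viewport meta tag? Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder URL
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

viewport = soup.find("meta", attrs={"name": "viewport"})
if viewport and "width=device-width" in viewport.get("content", ""):
    print("Viewport meta tag found:", viewport["content"])
else:
    print("No responsive viewport tag; the page likely fails mobile checks.")
```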

Suppose the result is negative, or you feel like you can do more. How do you optimize your site for this Google update?

A website needs to meet certain requirements to be considered mobile-friendly. Replace “mobile” with “user” and you can easily tell what half of them are; after all, mobile SEO is primarily user experience-oriented.

Let’s see how many of these you have guessed!

  1. Responsive design. It is possible to design a website in such a way that it will automatically adjust itself to any screen it’s displayed on, removing the need to zoom in or scroll sideways.
  2. Large font. Another way to spare your users effort is to make your text larger than you normally would for a PC screen. Consider making it even larger above the fold, where you are supposed to catch the visitor’s attention and motivate them to keep scrolling.
  3. No unplayable elements. Who wants to see messages like “this content cannot be played on your device”? They make users feel like they are missing out. Consider the types of content that certain devices don’t support (e.g., Flash has this kind of relationship with mobile) and avoid using them.
  4. No intrusive elements (with a few notable exceptions). Popups that all of a sudden cover the site you’ve been browsing peacefully are a UX killer. If you need to show ads or tactfully hint to your visitors that they can subscribe and get something good, do so in a way that will leave most of your content still visible and usable. Users will much appreciate it if you make your popups easy to close, too. Exceptions are notifications that have a good reason to block content, such as age verification popups.
  5. Space between interactive elements. If there are any buttons, checkboxes or the like on a page, make them large enough so they’d be easy to press and leave some room between them so the users with big fingers won’t press the wrong one by mistake.
  6. No separate website. Frankly, having one won’t hurt your SEO or rankings, but you’ll save time and effort if you don’t create one. Just focus on making your primary domain mobile-friendly instead of working on an m-dot.
    – If, for whatever reason, you need to have a separate website, make sure all your links between them are working properly. It’s not uncommon for a faulty redirect to lead to the other site’s home page instead of where the user wanted to go.
  7. Loading speed. Google has finally announced their plans to make web pages’ loading speed a ranking factor. It’s even more important to sites that want to be mobile-friendly, since mobile users tend to close a slow-loading tab sooner than most. How does one make their site load faster?
    – Image optimization. Images take a toll on the loading speed due to all the kilobytes (or megabytes) they are packing. It’s therefore necessary to reduce their file size as much as possible while preserving their quality. Pick optimal formats for your images and compress them with specialized software and services (a minimal compression sketch follows this list).
    – No excessive code. The less code in a page, the faster it’s parsed by the browser. Make your pages do their job with minimum HTML, CSS, JavaScript and other kinds of code. A special example of this method is AMP (Accelerated Mobile Pages), which strictly limits JavaScript and minimizes the rest, achieving blink-of-an-eye speeds.
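To make the image-optimization point concrete, here is a minimal Pillow sketch that caps image width and re-encodes JPEGs. The `images/` folder, 1200px cap and quality setting of 80 are illustrative defaults, not magic numbers.

```python
# Minimal image-optimization pass with Pillow (pip install Pillow):
# cap the width and re-encode JPEGs at a lower quality.
from pathlib import Path
from PIL import Image

MAX_WIDTH = 1200  # illustrative cap

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, round(img.height * ratio)))
    out = path.with_name(path.stem + "_opt.jpg")
    img.save(out, "JPEG", quality=80, optimize=True)
    print(f"{path.name}: {path.stat().st_size} -> {out.stat().st_size} bytes")
```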

6. The RankBrain Update

RankBrain was released in October 2015 as a complement to the Hummingbird algorithm. Although we don’t know exactly how RankBrain works, we do know that Google has publicly stated it is the third most important ranking factor.

RankBrain is a machine learning system that acts as a query processor, further helping Google understand search queries. It is believed that RankBrain continuously records and stores written and spoken queries and processes them into potential intents.

RankBrain is the only live artificial intelligence (AI) that Google uses in its search results. While Google uses machine learning to train its algorithms, autonomous AI isn’t otherwise let loose in the wild – and for good reason: if search broke, Google’s engineers would have no clue how to fix it.

RankBrain, however, is used to sort live search results to help give users the best fit for their search query.

RankBrain as a Ranking Signal

RankBrain has been called Google’s third most important ranking signal (behind content and links).

But is RankBrain really a “ranking signal”?

Not really. At least not in the way we think of traditional ranking signals.

RankBrain is a method of processing search queries in a way that infers a “best fit” for queries that are unknown to Google.

About 15 percent of the queries Google processes every day are new – in other words, nobody has ever searched using these exact terms before.

How can there be so many unknown queries? It’s a hard concept to wrap your brain around.

But if you think about all the different ways we talk about a person, place, or thing, you can quickly see how there could be millions of ways to ask even one simple question. That number will likely grow exponentially as we move further into voice search, smartphones get better at voice-to-text, and voice-only devices move into the home.

So, in the simplest terms, RankBrain is a processing algorithm that uses machine learning to bring back the best match to your query when it isn’t sure what that query “means.”

At first, RankBrain was only present in a small number of Google queries (about 15 percent). However, over time, it has expanded and is involved in almost all queries entered into Google.

That being said, if Google is sure of a query’s meaning, RankBrain has very little influence; it is only there to help when Google is unsure what a query means.
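To make the “best fit” idea concrete, here’s a toy Python demonstration: represent the query and candidate pages as vectors and rank pages by cosine similarity. The simple word-count vectors stand in for the learned embeddings a system like RankBrain presumably uses; this is a concept demo, not Google’s actual method.

```python
# Toy "best fit" matching for an unseen query: rank documents by
# cosine similarity between word-count vectors. Real systems would use
# learned embeddings that also capture synonyms; this is a concept demo.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = {  # placeholder page texts
    "page1": "best italian restaurant san francisco pasta dinner",
    "page2": "seo agency search engine optimization services",
}
query = vectorize("where to eat great pasta tonight")
for name, text in sorted(docs.items(),
                         key=lambda kv: cosine(query, vectorize(kv[1])),
                         reverse=True):
    print(name, f"{cosine(query, vectorize(text)):.2f}")
```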

The strategy for remaining successful with RankBrain does not differ from Hummingbird; it simply puts an even larger focus on ensuring your website is searchable, user-friendly and has an appropriate amount of content throughout. A wide range of keywords and supporting backlinks from authoritative partnering websites also prove beneficial.

7. The Fred Update

The most recent named Google update came in March 2017 under the coined name Fred. Fred’s goal is to identify websites that violate Google’s Webmaster Guidelines, send a warning, and apply a penalty if the issue is not resolved. These guidelines are designed to prevent many tactics, but the majority of flagged sites are dealing with content issues: anything from thin content with an obvious attempt to upsell, to pages covered in advertisements.

Fred is different from the other “named” algorithm updates because the name doesn’t actually mean anything specific: Google regularly updates its core algorithms to deal with quality issues, and it gave this one a name only as a joke.

Now Google does many of these quality updates and most go unnoticed.

An update only gets the Fred name if it has significantly affected a large number of sites or a certain vertical and SEO professionals want it identified.

Basically? All unnamed updates related to quality are now “Fred.”

Google algorithm updates aren’t always the most captivating reads, but needless to say, it’s important to stay up to date on them. If any of the updates above have you worried that you may be penalized, a good resource to check is Barracuda’s Panguin Tool, which will help you investigate whether you’ve been affected. After going over each of the major updates above, we can see a couple of overarching themes that can help us be successful:

  1. Google is transparent about SEO and wants your site to be impactful
  2. SEO can be complicated and difficult to master, but knowing just a few best practices goes a long way

It’s common to fall into the trap of seeing Google as the unfair ruler of the search engine world. But Google really does want your website to be as good as it can be; its success and relevance depend on it. When someone enters a keyword search, they expect Google to answer their question instantaneously, regardless of how many ways the search could potentially be interpreted. That can be a lot to ask. So for Google to pull off such a trick every time, it needs our help in making our sites as easy to crawl, understand and categorize as possible. And in return, Google will rank your site higher and higher up its results pages.

Further proof of Google’s support came when Google announced its top three search ranking factors. By naming the three biggest factors, Google not only showed it is more of an open book about SEO than we may have assumed, but also allowed us to home in on those factors and optimize our websites to rank as highly as possible. Diving in, it is no surprise that content and RankBrain are both in the top three; both stress the importance of quality, natural content, which is essentially Google rewarding the best sites with the best content. Finally, seeing backlinks rank so highly tells us Google is using the web to grade itself. Backlinks are votes of confidence for a website; the more votes a site earns, and the higher the authority of the sites casting them, the more legitimate the site looks to Google.

At the end of the day, SEO can be a daunting task to master for your website. But knowing what Google considers important, avoiding black hat SEO techniques like buying links and keyword stuffing, and monitoring what is on your site and who is connected to it are all easy tasks that can get your site ranking. From here, it’s key to find an SEO agency to partner with for the long haul. Take your time to review and choose the best SEO agencies and decide on a partner who understands your business objectives, and has a proven track record of success with customers in your industry.

How to Watch for Google Algorithm Updates

Google rarely announces its algorithm updates. And even when it does, it’s usually only after others have discovered them. (Although this may be changing, at least for their core updates.)

With so many tweaks going on daily, Google itself may not always know which changes will be significant enough to mention.

Often the first indication you have is your own website. If your search traffic suddenly jumps or dives, chances are good that Google made an algo update that affected your search rankings.

Where can you go for information when your online world gets rocked? Here’s what I recommend …

Have a “seismograph” in place on your website.

To detect search traffic fluctuations on your own website, you need analytics software. If you haven’t already, set up Google Analytics and Google Search Console for your website. They’re free, and they’re indispensable for SEO.
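As a simple “seismograph” on top of that, the sketch below flags large day-over-day swings in organic sessions from a CSV export. The file layout (date and sessions columns) and the 25% threshold are assumptions you’d adapt to your own export.

```python
# Flag suspicious day-over-day swings in organic sessions exported from
# your analytics as a CSV with "date" and "sessions" columns (assumed).
import csv

THRESHOLD = 0.25  # flag swings larger than 25%; illustrative value

rows = list(csv.DictReader(open("organic_sessions.csv", newline="")))
for prev, cur in zip(rows, rows[1:]):
    before, after = float(prev["sessions"]), float(cur["sessions"])
    if before and abs(after - before) / before > THRESHOLD:
        direction = "jumped" if after > before else "dropped"
        print(f"{cur['date']}: sessions {direction} "
              f"{abs(after - before) / before:.0%} (possible algo update?)")
```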

Watch the SERP weather reports.

Various websites and tools monitor ranking changes across categories and search markets and report on SERP volatility. Here are places you can check for early warning signs of a search ranking algorithm update:

Follow industry resources.

I’m always reading as an SEO. For the latest Google news, I recommend that you: