What does Google Hummingbird mean for Marketers?


This post was originally published on the TrinityP3 Blog.

There has been a massive shift in the way search works over the past couple of years. You could say that the internet is starting to grow up a bit. And one of the biggest innovations in search has been rolled out in the last couple of months – a new algorithm known as Google Hummingbird.

Let’s take a look at a bit of background first.

Any major changes in search will have a significant impact on anything else in the digital marketing field. Google rules the kingdom and when Google makes changes everyone takes notice and adjusts strategy – or they should.

There are very few platforms on the web that can afford to thumb their noses at the search giant. Every day you are using more of their products – just think about it – Gmail, Chrome, Analytics, Webmaster Tools, Google+, Google Places, Google Maps, YouTube, AdWords, Search, Picasa, Android, Google Drive…

Why has Google changed the face of the web and why should you care?

Google’s business model is based to a large degree on selling advertising. Yes, those ads that appear in the top three positions. Those ads that run down the right side of your search results. Those “Shopping” results with images and prices that look a lot better than the plain, old text results.

The reason they are so successful at selling these ads is their complete dominance of the “organic search” space. No one can come close – Bing and Yahoo trail miserably behind and as much as people like the idea of DuckDuckGo it really is nowhere near being a viable threat.

Now comes the balancing act – Google wants to increase revenue but does not want to sacrifice its reputation for offering the most accurate answers to any given query. And it must do this across billions of searches, drawing results from billions of web pages.

As soon as Google is seen as purely an advertising network, its value is gone. Searchers must continue to see the most relevant web pages served up or they will go elsewhere.

Organic search delivers this accuracy which is why it is trusted above the paid results. Anyone can pay for a position but organic placement must be earned. Unfortunately, anything that has the potential to be so lucrative will become a target. We all see approaches like these:

 Be number one on Google and drive thousands of visitors to your website – watch your business EXPLODE!

Unfortunately the only exploding your business will be doing with unsolicited offers like these is the “going bust” type.

To counter the large number of people who had devised ways to game the Google algorithm, a pretty substantial plan was needed.

How did Google counter the growing wave of manipulation?

It started with the Panda Update, which rolled out for the first time in February 2011. This was the first major assault on manipulative behaviour, and Google has continued to deliver shockwave after shockwave to an industry that often found it easier and more efficient to use automated software or webspam to get quick results (often euphemistically called “scaling”).

Panda continued to roll out on a regular basis over the next year, and then in April 2012 Google hit the web with the Penguin Update. This turned web marketing on its head. The major methods for artificially gaining authority for websites were now not only virtually useless but had actually become a source of penalties. This caused a huge loss of business, and there were numerous cases of collateral damage.

Quantity of inbound links had always been a big contributor to site authority. And everyone had known that the keyword focus of links (anchor text) pointing to your site “helped” search engines understand what your site content was about.

Overnight, quantity meant nothing unless the links came from high quality sources and having anchor text that favoured exact match keywords instead of brand name became a penalty magnet.

The SEO industry and many businesses reeled from the effects of this major and unannounced shift in policy.

Two new industries have sprung up as a result: “Negative SEO” – the sabotage of competitors’ sites by pointing thousands of toxic links at them – and “link clean-up” – getting rid of those links and begging Google for forgiveness. (In some cases both services are supplied by the same unscrupulous companies!)

Now that Google had made serious inroads into cleaning up spam they were ready to deliver something new and innovative.

The Knowledge Graph

Although the Knowledge Graph is a fairly new addition to search, I think most of us have become used to its features fairly quickly.

A large number of marketers cried foul but the intent of this feature did not deviate from Google’s core purpose – to deliver accurate information as quickly and effectively as possible. The complaints were about the fact that in a large number of niches people no longer had to click through to a website to get the answers they were looking for.

The answers to a query were presented without the searcher having to leave a Google product – i.e. Google Search.

Analysts have found numerous inaccuracies – to be expected, to an extent, with any new technology – but for the most part the Knowledge Graph works as it should.

Take this example – a search on Pulp Fiction (appropriate, as Darren Woolley, Founder of TrinityP3 is considered the “Mr Wolf” of Marketing Communication).

Pulp Fiction

Without having to leave the page you can get answers about the cast, director, release date, ratings, a snippet from Wikipedia, images, characters and awards.

IMDb would not have been impressed, as it would have seen its traffic decrease significantly.

The Knowledge Graph also kicks in when you search for known brands, products, people, celebrities, historic events, recipes, flight times, the stock market, sports, weather and much, much more.

And Google had more in store.

The keyword data (not provided) issue

The next big hint of a shift toward something completely new was the appearance and growth of something that was extremely disturbing to marketers and data analysts.

Keyword data had always been one of the most important metrics for measuring the effectiveness of SEO, as well as for seeing how well a content marketing strategy was performing. You could easily track which keywords were bringing visitors to a site and could demonstrate how certain content was effective not only in ranking but also in converting those visitors.

(not provided) was introduced under the flimsy guise of a “privacy issue”, but very few marketers believed this to be the case. Initially this segment of mysterious data was relatively small, but within months it had grown to a substantial percentage of keyword data for most sites (interestingly, technology-based sites were hit the hardest – coincidence?).

This became a significant frustration and many marketers developed tactics to deal with this gap in information – many of these tactics were convoluted and completely inaccurate.
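One common tactic was simply to measure how large the hidden segment was becoming. A minimal sketch of that calculation, assuming keyword data exported as (keyword, visits) pairs – the sample figures below are invented for illustration:

```python
# Sketch: measuring what share of organic visits arrive with no keyword data.
# Assumes keyword data exported as (keyword, visits) pairs; the sample
# numbers below are invented, not from any real analytics account.

def not_provided_share(rows):
    """Return the fraction of visits whose keyword is hidden."""
    total = sum(visits for _, visits in rows)
    hidden = sum(visits for kw, visits in rows if kw == "(not provided)")
    return hidden / total if total else 0.0

sample = [
    ("(not provided)", 820),
    ("strategic marketing consultants", 95),
    ("marketing management", 60),
    ("agency benchmarking", 25),
]

print(f"(not provided) share: {not_provided_share(sample):.0%}")  # → 82%
```

Watching that percentage climb month on month was, for many marketers, the first hint that keyword-level reporting was on the way out.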

Then Google announced and enforced a 100% (not provided) rule a few weeks ago.

The reaction was pretty incredible – “Did Google just destroy the web?”, “Google – the evil empire!”, “Google hates SEO and marketers”, “Google wants to control us all”, “Google hates small business”, “Greedy Google”…

And so on…

None of which was true of course.

Finally, Google announces the new Hummingbird algorithm

Unbeknownst to most marketers, Google quietly replaced the bulk of its algorithm in August and began testing and tweaking it.

There were no announcements and only a small number of data scientists picked up on the fluctuations in search behaviour.

The world did not end.

This change is possibly one of the most important changes to how the web works in a decade and most people did not even notice. Until they were told.

Hummingbird was officially announced at the garage where Google began business to coincide with their 15th birthday – nice touch!

And although there was a bit of confusion around whether Hummingbird was actually a new algorithm, Google was relatively clear about the intent of this new technology.

How does Hummingbird work?

“Our algorithm had to go through some fundamental rethinking of how we are going to keep our results relevant…The main focus, is that the new algorithm allows Google to more quickly parse full questions (as opposed to parsing searches word-by-word), and to identify and rank answers to those questions from the content they’ve indexed.” – Amit Singhal

Hummingbird has shifted the emphasis in search and it has done this with one eye firmly on the future.

Traditionally (if I can use that term), search has been driven by keywords. People have learned to use shortened phrases that machines will interpret to give an accurate result. There is very little similarity between keyword-based searches and natural language use.

A search like “strategic marketing management consultants”, although effective in finding companies that are authoritative in this field, uses language that is shortened and a bit clumsy.

What Hummingbird has begun to bring to search is natural language queries or “conversational” language accuracy.

This means that a search such as “The top strategic marketing management consultants in Australia” (click through on this one – you might find the results interesting) will be recognised despite the target pages being optimised for specific keywords that cover only a portion of the full search query.

The encouragement of conversational search is happening for a couple of reasons. The first is the future of “Voice Search”. Although this has been around for a while now, its use will intensify as more mobile users begin to adopt the feature. This will be amplified by the uptake of Google Glass, which is limited to an “Explorer” beta launch so far.

As users are not going to speak in “keywords”, conversational search will have to be accurate.

The second key element for Hummingbird is a vast improvement in semantic intelligence. Disambiguation and searcher intent have become a primary focus and much of this is taken from a greater understanding of context. Language is an incredibly complex thing and there are numerous meanings for the same words based on their context.

So, Hummingbird had to be smarter at identifying user intent based on possible meanings of other words in the full context.

Additionally, if a user is on a mobile device their intent is likely to be quite different from that of someone on a desktop.

For example, a mobile user who searches for a particular product might be given local stores based on their location, whereas a desktop user might be given a range of ecommerce options that ship to their country – or even a definition of the product, or other information if that is seen to be appropriate.

So, multiple factors will influence the results – geographic location, device type, previous search history, context, semantics and much more to enable a close match to what people are looking for.
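To make the idea concrete, here is a toy sketch in Python. The rules and signal names are invented purely for illustration and bear no relation to Google’s actual ranking logic – the point is only that the same query can deserve a different flavour of result depending on context:

```python
# Toy illustration: how context signals (device, location) might change the
# flavour of result served for the same query. Invented rules, not Google's.

def result_flavour(query, device, location=None):
    """Choose a result flavour for a product query from simple context rules."""
    if device == "mobile" and location:
        # A mobile searcher with a known location likely wants a nearby store.
        return f"local stores for '{query}' near {location}"
    if device == "desktop":
        # A desktop searcher may be researching or buying online.
        return f"ecommerce sites that ship '{query}' to your country"
    # Fall back to general information when intent is unclear.
    return f"general information about '{query}'"

print(result_flavour("hiking boots", "mobile", location="Sydney"))
print(result_flavour("hiking boots", "desktop"))
```

The real system weighs far more signals (search history, semantics, entity data), but the shape of the decision is the same: context narrows intent before results are ranked.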

Another fascinating development is the ability to recognise a series of queries. A recent Forbes article gives examples like this one, where the previous search is taken into account in subsequent queries.

Search query strings

As you can see, the second search follows how natural language would be used. Instead of the follow-up being “How tall is the Empire State Building?”, it is simply stated as “How tall is it?” and the algorithm looks at the previous search for context.

What does this all mean for Marketing?

I see all of these developments as being very positive. There has been so much confusion around keywords for so long and often, one of our first tasks with new SEO projects is to undo over-exuberant keyword use both in metadata and content.

Shouldn’t we be writing for search engines?

Shouldn’t we get as many keywords as possible into our metadata because that helps us rank well?

Shouldn’t our titles read Keyword 1 | Keyword 2 | Keyword 3 | Brand?

No. Not since about 2009, anyway.

Shouldn’t we use our target keyword multiple times in our content?

No.

Should I forget about keywords then?

No. Definitely not.

Keyword research is still extremely important in ensuring your content will be found. Keyword optimisation is still important. What has changed is that keywords have moved from a tactical footing to a strategic one.

And to help calm down individual keyword obsession Google has kindly given us (not provided) and has shut down the Google External Keyword Tool (and replaced it with Keyword Planner).

To be completely honest, optimising for specific phrases targeted at page content – and being able to show not only international rankings but growth in visitor numbers as a direct result – was a pretty effective bragging-rights tool. But who did it really benefit? The client?

Not really.

Now that we do not have to focus on tracking specific keywords what should we be doing?

More than ever before your focus must be on creating regular, high quality content.

  • Content that answers your customers’ or clients’ most important questions.
  • Content that demonstrates your expertise.
  • Content that is comprehensive and generous with information.
  • Content that engages and is sharable through social media.
  • Content that is unique and newsworthy.

And when you want to track results look at growth in organic visitors, numbers of pages viewed, time on page, most popular content, unique visitors, conversions, traffic from other sources. Forget about whether Keyword #1 has gained 0.25 of a position!
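Reporting on those broader metrics can be as simple as comparing two periods. A hedged sketch in Python – the metric names and figures below are invented for illustration, not drawn from any real analytics export:

```python
# Sketch: comparing broad engagement metrics between two reporting periods,
# instead of obsessing over individual keyword positions. The metric names
# and sample numbers are invented for illustration.

def growth_report(previous, current):
    """Return the percentage change for each metric between two periods."""
    return {
        metric: round((current[metric] - previous[metric]) / previous[metric] * 100, 1)
        for metric in previous
    }

previous = {"organic_visits": 4200, "pages_per_visit": 2.1, "conversions": 38}
current = {"organic_visits": 5100, "pages_per_visit": 2.4, "conversions": 51}

for metric, change in growth_report(previous, current).items():
    print(f"{metric}: {change:+.1f}%")
```

A report like this tells the client something they actually care about: more people arriving, reading more, and converting more often.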

How have content strategy and search engine optimisation changed post-Hummingbird?

It hasn’t really changed at all.

These are some of the elements that will continue to drive digital marketing success:

  • publish high quality content frequently
  • audit your web properties regularly for technical compliance and errors
  • focus metadata on page content, not wishlists
  • develop a broad reach through social media
  • stop ignoring Google+
  • amp up your PR efforts
  • update your design and User Experience
  • factor Conversion Rate Optimisation into your web pages
  • evaluate your Pagespeed and make improvements
  • build relationships with influencers and authorities in your niche
  • develop an effective content promotion program
  • talk to your audience in a voice that is compelling and relevant to your market
  • be innovative and creative with your efforts
  • test and learn, test and learn
  • monitor all your efforts through Analytics and Webmaster Tools
  • allocate time each day to keep up with developments and see what thought leaders are saying
  • understand the Google Webmaster Guidelines

Nothing new here really.

I’m interested to hear what you have found so far with Hummingbird. Has it made a difference to your search experience? Have you adjusted your marketing strategy to take Hummingbird into account? Let me know by leaving a comment.