This week Google announced their biggest change since 2007, and considering they make around 500 changes to their algorithm each year, some of which have had a massive impact on their search results, this change is potentially a game changer. Google have always held the dream of being able to understand people's intentions through their searches and to serve them results that perfectly meet their needs. If they can give users the most relevant results possible for the searches they run, then users will be happy and Google will win (even more than they already do).
Google in their official release stated:
“The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.”
“Google’s Knowledge Graph isn’t just rooted in public sources such as Freebase, Wikipedia and the CIA World Factbook. It’s also augmented at a much larger scale—because we’re focused on comprehensive breadth and depth. It currently contains more than 500 million objects, as well as more than 3.5 billion facts about and relationships between these different objects. And it’s tuned based on what people search for, and what we find out on the web.”
Here is the release from Google
There are 3 main parts to this update which will make a difference to how Google’s natural search results are presented:
1) Links to different sets of results based on contextual meanings for any given search term.
When you search for something you have a clear picture in your head of what it is that you want to find; of course, a search engine is just a machine and has to do its best with the information that you give it. The clearer and more specific your search term is, the easier it is for the search engine to give you the results that you are looking for. The example that Google use is the “Taj Mahal”: if you search the words “Taj Mahal” then you may be looking for a photo, the history, a travel guide, a Grammy award winning musician or a local Indian restaurant. The Knowledge Graph aims to cover these possibilities by giving searchers a set of options from which to narrow their search down to the correct interpretation of their original query.
2) There will now be topic summaries with key facts visible in the sidebar of the search engine results pages (SERPs)
Google know that when people search certain topics, there are things they regularly want to know about those topics; they are now stating that this functionality is a part of the new Knowledge Graph, although I am pretty certain that this has been taken into consideration for a very long time by all major search engines. Google have a massive amount of data which they can aggregate to understand the key areas of interest relating to broad topics. However, what is now different is the presentation of this information. As you can see in the example below, there will be a summary of key facts shown in the sidebar to the right of the main results.
This image from Google shows the sidebar info covering the position of the paid search ads; I would be very surprised if that happens in reality. Google have consistently increased the visibility of their paid search results, so covering them up now seems very unlikely.
3) “Information boxes” that offer additional information in the SERPs sidebar based on popular related queries.
This feature offers information on closely related search queries; Google look at search behaviour patterns in aggregate to find out what people generally search for next after they have run a specific query. This can preempt your next search by giving you the result before you have even decided that you want to search for it. I am not 100% sure that I agree with this: if the results are presented to you then they will surely influence your next click/search and what you see next, so perhaps in the long term this could mean a dumbing down of the search results as things become more generic? Then again, I believe Google are smarter and more inspirational thinkers than that, so this is probably not the case.
Here is a short video from Google summarising the Google knowledge graph and what they hope it will achieve:
It is early days yet; I have not even seen a live version of this, as it is only being rolled out in the United States to English language users over the coming weeks before moving over to the UK soon, I guess. However, such a major change will surely affect the ability of marketers to get their content in front of users on the Google search engine.
The most concerning thing is that Google seem to be relying more on trusted major data sources such as Wikipedia as their partners, and may therefore surface less content from newer, fresher websites.
A lot of what I have read recently about Google's changes, including some of the patents that they have recently filed, focuses heavily on trust, and specifically on trusted authors and their topical relationship to subjects.
I believe that the Knowledge Graph will only take this further, meaning that in order to continue to gain traffic from the search engines there is going to be an even greater emphasis on links from trusted websites, and of course this comes down more and more to great content.
Links from off topic sites will surely continue to be devalued and discounted as they only serve to damage the quality of the search results and they are pretty easy to spot.
Therefore, nothing new really; keep creating great content and building relationships with thought leaders in your sector and you will win out in the end.
By the way, one thing that I think needs to become clearer amongst small business owners is that SEO is not a magical way to get tons of free traffic in a short space of time; traffic can grow quickly and steadily through SEO, but only off the back of a combination of hard work and good ideas.
On the face of it, things are becoming tougher and tougher for small publishers. It is definitely much harder for new businesses to take market share using the same kind of content that the big brands are producing. However, there are 2 things that make me more optimistic:
1) It would be crazy for any search engine to optimise towards only trusted sources of content and not to include newer, smaller organisations' content in the SERPs, as this is completely against the point of the internet – the search engines' real aim is to find the best and most appropriate content for a user's search, wherever that data is on the internet. Achieving this is not easy, but limiting sources of data is like giving up, and I cannot imagine any one search engine doing this as it is a losing strategy.
2) Social media, at least in theory, gives everyone the chance to get their content noticed – although in practice it does help to be connected to influential individuals. Smaller organisations still have the advantage that they are less risk averse and faster to move.
Search engines still favour fresh content. I think that to survive nowadays, smaller businesses must be creative – more creative than the big brands; they cannot compete on volume, so it has to be on originality and quality. I believe that Google in particular are constantly improving the search experience and filtering out the crap, so although they are not there yet, a business that focuses on building great content in a sustainable way will win out in the end.
However, getting the most value from content now also requires creativity. It is no longer enough to just publish regular blog posts; these should be marketed hard, with a focus on existing customers. If you can improve loyalty amongst existing customers, they have the tools available to help you win new business by sharing your content.
Key in the future will be the quality of the data held in websites; in order to show up well in search engines, retailers are going to have to start using structured data such as schema.org to ensure that their websites' data is commonly understood. However, it does seem that Google are now aiming to provide answers directly in the search results using this data. This obviously means less traffic to individual websites – pretty frustrating if you spend 2 years compiling a massive set of data on a topic and then Google just goes and scrapes it and serves it up on their own site.
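To make "structured data" concrete, here is a rough sketch of what schema.org microdata might look like on a retailer's product page; the product name, price and image path here are invented purely for illustration. The `itemscope`, `itemtype` and `itemprop` attributes tell the search engine unambiguously which bits of the page are the product's name, image and price, rather than leaving it to guess from the surrounding HTML:

```html
<!-- Hypothetical product page marked up with schema.org microdata.
     The product, image path and price are made up for this example. -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Acme Anvil</h1>
  <img itemprop="image" src="/images/anvil.jpg" alt="Acme Anvil">
  <p itemprop="description">A sturdy cast-iron anvil for the home workshop.</p>
  <!-- Nested Offer entity carrying the price information -->
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">49.99</span>
    <meta itemprop="priceCurrency" content="GBP">
  </div>
</div>
```

The trade-off discussed above applies directly here: markup like this makes your data easy for Google to understand, which can earn richer listings, but it also makes it easy for them to lift the answer straight into their own results.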
This is perhaps another signal to diversify sources of traffic to your website and to rely less on Google or any one source of traffic.