Issues Google Needs To Address in 2014

Shell Robshaw-Bryan Marketing Consultant, Surefire Media

Posted on December 16th 2013


All is not well in the world of Google.

In the post-Hummingbird SEO landscape, monitor any message board and the Google-related sentiment is generally negative. Perhaps this isn’t surprising, given that SEO black-hatters, slow or reluctant to change their ways, have been hit hard by the latest changes. However, even among ethical SEOs who were already using sustainable, Google-pleasing techniques, there is still a lot of negativity cast in the search giant’s direction.


Claims of anti-competitive and unethical practices are on the rise, and webmasters are increasingly vocal in condemning Google’s often nonsensical policies. Take its stance on ‘bad’ links, for example. Google remains woolly, but generally describes bad links as those from low-quality directories or bookmark sites, text advertisements that pass PageRank, and advertorials. These guidelines aren’t as clear as they might initially seem. How do you define a ‘low-quality’ directory? More importantly, how does Google define a low-quality directory?

Read more about low-quality links and Link Schemes here.

Directories such as the long-standing, human-edited Dmoz are a pretty safe bet, but what about all the others? Currently, all webmasters can do is make an educated guess, looking at factors such as relevancy and Domain Authority, but it’s not always easy to know which directories and websites are on Google’s safe list and which are on its blacklist. It would of course be a huge help if Google gave clearer information, but it doesn’t.

Disavow Can Harm Your Site’s Performance

This has led to webmasters using the disavow tool on vast swathes of backlinks, in an attempt to limit damage already caused or to reduce their chances of picking up a future penalty. However, disavowing a large number of backlinks, some of which may be actively helping your website rank well, can itself be harmful; you may see your domain authority fall, for example.
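
For context, the disavow tool works by uploading a plain text file via Webmaster Tools listing the links you want Google to ignore. A minimal sketch of the documented format (the domains and URL below are placeholders, not real sites) looks like this:

    # Links we contacted the webmasters about but could not get removed
    domain:spammy-directory.example.com
    domain:low-quality-bookmarks.example.com
    # Individual URLs can also be disavowed, one per line
    http://blog.example.com/paid-advertorial-post.html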

Google’s own Eric Kuan warns “The disavow backlinks tool should be used with caution, since it can potentially harm your site’s performance.” At the same time, Google’s Matt Cutts suggests that aggressive use of the disavow tool is the way to go in his recent video, How can a site recover from a period of spamming links?

Google has effectively created an environment where it warns against having too many low-quality backlinks and points webmasters towards the disavow tool, whilst at the same time warning against using that same tool to clean them up. In both cases, the overall result is that the average website is more likely to rank poorly, leading to decreased visibility. So why does Google choose to do this? Simple: a loss of organic rank makes it more necessary for website owners to turn to paid search, so Google benefits from an increase in AdWords revenue.

Tackling Webspam

Of course, Google could choose to simply ignore these bad links (as it historically did). Alternatively, it could let webmasters know which links to remove by providing a notification and a grace period in which action would need to be taken. Instead, without any warning, Google chooses to hand out often drastic penalties, leading in some cases to a devastating drop in rank or exclusion from search results altogether. Recovering from a penalty is often a long and drawn-out process (I’m talking several months here, not days), as webmasters flounder around, trying to manually identify and disavow ‘bad’ links and remove any potential over-optimisation that might historically have been done on the website.

Now, almost every business I’ve worked with has used a number of different SEO companies over the years. Businesses pay for the services of an SEO expert and put their trust in them, in the implicit belief that the techniques being used will have a positive, rather than a negative, effect.

The simple truth is that Google is now penalising thousands of businesses who have done nothing more than unwittingly employ a black-hat SEO company at some time in their past. I don’t disagree that Google needs to tackle webspam, and I believe the recent algorithm change has proved successful in doing this up to a point. However, the simple fact remains that thousands of website owners are suffering due to harsh penalties, and are being forced down the paid search route as a direct result.

Read more about Google’s quality guidelines here.

Sure, there are always going to be those who try to game the system, but I don’t believe for one second that every single website that’s picked up a penalty or suffered a drop in rank has an evil black-hatter at the helm, hell-bent on playing the system. I would say that these sorts of websites are in the minority.

Some of the Issues Google Needs to Address in 2014

1. More information and better support for webmasters

Better support is needed for webmasters struggling to recover from penalties. Whilst Google does report on manual actions within Google Webmaster Tools, penalties are often picked up with no such manual action being reported. This makes it extremely difficult to figure out what a website is being penalised for, making recovery lengthy and, ultimately, less likely. Webmasters need better and more detailed information to help them recover, and a clean-up grace period needs to be introduced.

2. Credit to original content sources

Google also needs to be better able to identify (and penalise) low-quality content aggregation websites that simply scrape other blogs for content. I’m fed up, and I know other bloggers are too, of seeing content from my blog published on sites I’ve not authorised to share it, with no rel="canonical" link element. A website like Yahoo Business has far more traffic than my own blog, as well as thousands more backlinks, better domain authority and so on. The chances of my blog being credited as the source are slim, making it more likely that the content scraping site obtains the benefits.
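
For reference, the rel="canonical" link element mentioned above is a single tag placed in the head of the republishing page, telling search engines which URL is the original source. A minimal sketch (both URLs are placeholders):

    <!-- On the page that republishes the article -->
    <head>
      <link rel="canonical" href="http://www.example.com/original-article/" />
    </head>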

3. Full implementation of publisher markup

I also hope that we see Google publisher markup working effectively and consistently, as this is still currently very hit or miss in my experience.
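
For anyone unfamiliar with it, publisher markup is currently implemented as a single link element, usually on the site’s home page, pointing at the brand’s Google+ page. A minimal sketch (the Google+ page URL is a placeholder):

    <!-- Placed in the <head> of the home page -->
    <link rel="publisher" href="https://plus.google.com/+YourBrandName/" />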

4. Alternatives to local search for internet-only small businesses

Finally, I’d like to see Google start supporting SMEs with more innovation in the area of local search. Worryingly, an increasing gap is opening up when it comes to search visibility. At one end of the scale you have the big brands with plenty of cash, who are able to maintain prominence in search via AdWords. At the other end you have local businesses, which are serviced relatively well by Google Local listings. Somewhere in the middle, though, we have a plethora of small internet-only businesses who are struggling to maintain visibility in organic search, who can’t afford extensive PPC campaigns, and for whom local search is irrelevant.

How do you feel about the recent changes that Google has made? Have you been affected by a penalty, or perhaps recovered from one? What issues do you see facing Google over the coming year?

Resources

Google: The Link Disavow Tool Can Harm Your Sites Performance
http://www.seroundtable.com/google-disavow-tool-harm-17327.html

What is Duplicate Content
http://moz.com/learn/seo/duplicate-content

The Benefits of Rel=Publisher
http://www.advancessg.com/googles-relpublisher-tag-is-for-all-business-and-brand-websites-not-just-publishers



Shell Robshaw-Bryan

Marketing Consultant, Surefire Media

Shell is a marketing consultant who works for the Cheshire-based digital agency Surefire Media, where she specialises in organic search, content strategy and social media engagement. Shell has extensive experience in consumer retail brand marketing, SEO, blogging and content strategy.

 


As well as writing for her own blogs, Camping With Style and Uber Marketing, Shell writes content for a wide range of client blogs. She is also a keen snowboarder, whose other hobbies include travel, camping, music and photography.


Comments

David Amerland
Posted on December 16th 2013 at 9:59AM

Shell, that is a great piece on Google's need to change some of its practices going forward, but I would question a couple of your points here, or at least the assumptions behind them. For instance, the Google guidelines on link schemes are pretty detailed and very clear, and Google has, over the last 24 months, pumped out an incredible amount of content in blog posts on their search blog and in videos, both on the Webmaster Tools channel and via Matt Cutts on webspam, that in their totality spell out that unless a link provides value to the end user and comes from an authority website, it is questionable why it is there; and yes, that would include most directories. 

You also say that the majority of webmasters are innocent in their linking practices and unaware of the behaviour of the SEOs they hire. The Inland Revenue (to digress a little and give a contextual example) says that ultimately you are responsible for your taxes, not your accountant, which means that for a business person to argue that they did not know what someone they hired was doing on their behalf is a defence that has no legs, anywhere. 

Over the last fifteen years I have seen, time and again, that the moment webmasters become aware that something works in search to boost rankings they instantly implement it regardless. 

I am not 100% sure about the issue with publisher markup. It is easy enough to implement using the rel=publisher tag, but just like with Authorship, Google has said that unless a website is of high enough quality, implementation of the markup is unlikely to help. 

Finally, local search (the bulk of which is on mobile) is fully semantic for US-based users and increasingly for UK ones. A local business that is best suited to surface in response to a search query will do so regardless of size or amount of money spent on SEO, provided, of course, it has taken the steps necessary to be indexed properly. 

 

ubersocialmedia
Posted on December 16th 2013 at 10:35AM

Thanks for your comments David. It's been really interesting to read your thoughts.

I agree that Google has put a lot of information out there regarding bad links, but there is still no definitive way of knowing which directories are good and which are bad, leading webmasters to be over-cautious, resulting in them taking actions which actually end up being more damaging as they remove all but a handful of backlinks.

Regardless of all the information Google has published on the matter, there are still grey areas, especially for businesses who, as you point out, are ultimately accountable for what their SEO does. I guess I'm coming at it from the point of view of a small business owner who has a website and has employed someone to do their SEO. I think it would be difficult for them to make a judgement call as to what constitutes a good or bad backlink. Without knowing this, it is difficult to know whether your SEO is doing a good job or not. Of course, most spammy, low-authority websites and directories are very easily spotted, even to the untrained eye, but it's the directories that are borderline that I am talking about.

From conversations with other SEO professionals, I'd say the vast majority are ethical. Though I'm not naive enough to believe that all SEOs are, I certainly don't believe the majority are gaming the system.

I think some SEOs can indeed be quick to jump on the next new thing that works, because if they don't, their competitors will gain visibility whilst they drop down the ranks, leading to a bit of a catch-22 situation.

My issue with publisher mark-up is exactly as you point out. It's currently only useful for 'quality' brands, who are the only ones benefiting from it, leaving little incentive or reason for any small business website owner to implement it. This only serves to exacerbate the divide between the large amounts of visibility that big brands enjoy and the more limited exposure that smaller or less well-known brands get.

With regards to local search, I don't dispute that a local business will surface in response to a relevant search query regardless of its size or marketing spend. I was pointing out that there are plenty of websites that aren't tied to a geographical location, which are internet-only retailers, in which case local search is of no real use to them.

As a customer, if I want to buy something online, as long as the company I buy from is in the UK or ships to this country, I couldn't care less what part of the UK they are in, and I don't want my results to be limited or influenced by location. If I'm looking for a plumber, however, then of course I want geographically relevant results showing.

Unless I'm missing something vital here, I don't see how local search benefits internet-only businesses, for whom location is completely irrelevant.

David Amerland
Posted on December 16th 2013 at 1:06PM

Shell, thank you for the quick reply. I know you're not being obtuse when you say it's difficult to decide which directory is a 'good' one to obtain a link from. I think the question should be why a directory listing that is open to everyone and anyone would constitute a good place to obtain links from at all. For the record, directory listing was one of the most abused backlink building techniques until Google started rolling out the Panda update in early 2011, in preparation for semantic search, cleaning its index of artificially inflated PR sites and directories whose sole purpose was to provide links.  

Everyone games the system; it's human nature. The difference lies in the degree to which that happens and the risks being taken. Semantic search makes that way harder, which is why so many SEOs are complaining about it. 

I did not point out that publisher mark-up is useful for quality brands. I said quality websites; that's irrespective of size, provided the criteria for quality are met. 

Finally, you totally misunderstood my comment on local search. Local search is fully semantic. It therefore takes into account your location (for location-based searches), your search history, your social connections, your search patterns, answers to similar queries, your search query, the time of day (where relevant), and whether you are in a car or on foot (where appropriate) to provide the best answer possible (it will not restrict results geographically unless not doing so would degrade them, as per your plumber example).

As you're no doubt aware, semantic search indices need to be built for each language. Voice search started out in the US and is being rolled out across the world, hence my reference to US and UK audiences. I did not, at any point, mention that it is for internet-only businesses. If anything, I specifically said that "A local business that is best suited to surface in response to a search query will do so regardless of size or amount of money spent on SEO, provided of course, it has taken the steps necessary to be indexed properly." Local implies a physical location in a locale. 

ubersocialmedia
Posted on December 16th 2013 at 3:23PM

Many thanks for clarifying David.

LisaAdams08
Posted on December 16th 2013 at 12:59PM

Hi Shell, I like your article, and was wondering if Google's Hummingbird release is something you feel should be addressed for those of us in more of a marketing role (I'm nowhere near as informed as David on the technical aspects). When company execs are used to seeing keyword information in our web reports and it's no longer available, it's difficult. I'd like some guidance on how we can fill the gap in reporting. Can you or anyone else reading this advise? Thanks for any guidance.

ubersocialmedia
Posted on December 16th 2013 at 3:30PM

Hi Lisa, I'm in a similar position to you. Whilst SEO is part of my role, it's not my sole focus, so there are always a lot of plates spinning at any one time and many interplaying factors that need considering. With regards to keywords, however, if you use Google Webmaster Tools there is some keyword data in there which may help you.

Depending on your budget, there are also alternative analytics services you can use; I tend to hear good things about Moz, but haven't used it myself.

I think anyone involved in marketing and content creation roles should be aware of Hummingbird, and of the role that semantic search in particular now plays, with the focus on more natural search phrases rather than individual keywords. As long as you know your market and understand your customers well, the loss of keyword data doesn't need to be disastrous, but Webmaster Tools should help.

David Amerland
Posted on December 16th 2013 at 7:00PM

Lisa, it's by far the most common question asked these days. I covered Hummingbird's implications here. Webmaster Tools still reports keywords (for the last 90 days only), but you should be slowly weaning your customers away from them anyway, as keyword rankings make less and less sense as Google's SERPs become highly personalized. What you should be looking at are conversions and a decline (or rise) in completed calls to action. This also means that landing pages need to be re-examined. There is a free analytics tool that carries out semantic analysis of social media reach and sentiment (http://goo.gl/vaxUbN) that may help a little.  

akronsound
Posted on December 16th 2013 at 7:04PM

" I don’t disagree that Google needs to tackle webspam and I believe the recent algorithm change has proved successful in doing this up to a point".

Can you elaborate on that please? How exactly has the algorithm change proven successful when Matt Cutts tweeted about and sent manual penalty notifications to 2 of the biggest black hat networks this year?

A manual notification means that the penalties are NOT algorithmic, but rather manual inspections by the webspam team, who purchased links on those networks, tracked the footprint and the websites linking to them, and then penalised them. Is Google that big that Mr. Cutts is so proud to tweet about the Anglo Rank and Ghost Rank networks going down? And how did he discover them? By having profiles in the forum and purchasing services. Great algorithm indeed.

I will say this and you can choose to believe it or not. Google is not working the way you want to think it does. Nobody can fix the Google algorithm, not even Google itself, unless they decide to remove links entirely from their ranking factors. The past already shows that their algorithm gets a lot of things wrong and penalises or downranks sites that have good content, while on the other hand rewarding sites that do not deserve it at all or have a spam link profile. I have given some good examples in an older blog post about 5 unusual reasons your site might not rank as high as you deserve. Blog network and high PR comment links are very powerful, and as long as your competitors are using them you will never outrank them or even get the rankings your site and content might deserve.

And before I get bashed with "these tactics are short term", let me remind you that blog commenting got banned in 2006 yet sites still rank on page one for months with it, anchor texts have carried on working for the last 5 years, exact match domains are still dominating, and link networks have been 90% responsible for your rankings since 2002. Yes, 2002! Short term, huh?

Other than that, great post; it is nice to see the two sides of the coin rather than just following Google's PR videos and press releases. 

ubersocialmedia
Posted on December 17th 2013 at 3:03AM

Hi Yiannis. Google needs to tackle webspam, I agree with this.

Google tells us about various websites that it has recently excluded from its listings altogether. If identifying these kinds of low-quality, spammy sites has nothing to do with Hummingbird or any of its other algorithms, then that's my mistake. However, whilst manual actions may be more severe, there are plenty of penalties that absolutely are algorithmically generated, and these can have just as devastating an effect on small business and website owners.

I don't agree with all of these sweeping penalties, however, which is why in my article I state that I believe Google's current webspam actions are working only up to a point.

I hope that clarifies the thinking behind the statement you highlighted.

Katherine Tattersfield
Posted on December 16th 2013 at 7:12PM

I just want to say thank you Shell for posting this. I have been dealing with these issues for a while now, and I completely agree that small business owners shouldn't fall victim to these sweeping penalties. It's very hard to figure out what's wrong on a site and how to fix it. Since Google deemed themselves the arbiters, they need to step up and offer real support. Notice how they provide extensive support for agency accounts in AdWords *whistles*

 

Another thing I'd like to see Google change in 2014: PageRank. Talk about conflicting information. One minute we should all focus on quality content, the next minute they update PageRank, which is based solely on backlinks. Nothing perpetuates webspam and link selling more than this meaningless metric. If they really cared about quality, they would eliminate PageRank, as many speculated they would do earlier this year. 

ubersocialmedia
Posted on December 17th 2013 at 3:08AM

Thanks Katherine. I definitely think smaller businesses suffer more when it comes to penalties, especially those who aren't able to plug the gap by using paid search in the way big brands can much more easily do. More clarity on the role of PageRank would be very useful too, agreed. 

Let's hope the new year sees more changes, just hopefully not quite as drastic as all those we've seen this year! 

jrconsultancy
Posted on December 24th 2013 at 10:46AM

I would echo Katherine's thoughts; it is really tough for small businesses once they get a penalty. Through my own experiences, I felt the wrath of Google back in 2004. All I can say is that Google is on the right track: back in 2004 you didn't even get informed you had received a penalty, whereas at least now you get a notification in WMT. There has been more and more progress with this in the last 2 years, with Google giving specific advice in there, and more specific information about the type of penalty. For example, a backlink penalty now on most occasions comes with examples of backlinks that go against the guidelines.

 

Let’s hope that 2014 sees yet more transparency from Google about their guidelines and their reasons for slapping a penalty on sites. But what I really hope for is a level playing field, because currently this is still not the case, with many sites still getting away with doorway pages and multi-site setups completely against the guidelines. What is not fair is slapping penalties on some but not all; the rules have to apply to everyone, and there are still many examples of sites getting away with all sorts of techniques.

 

Regarding PageRank, whilst there is still conflicting information about it, I prefer to use the Moz tool to check authority.

Barbara Mckinney
Posted on December 16th 2013 at 9:46PM

Google made a lot of changes this year that affect marketers in every industry. I think they have to slow down a bit so marketers can understand what they are really trying to do for the betterment of their company and their clients.

ubersocialmedia
Posted on December 17th 2013 at 3:20AM

Hi Barbara. I agree, I think we need more time now for the dust to settle.

Nathans Natural Suppliment and Nathans Natural Health
Posted on December 17th 2013 at 12:22AM

Google is looking to change some tools that give a new challenge to SEO companies.

Thanks

Nathans Natural

Byron Hardie
Posted on December 23rd 2013 at 10:09AM

Thanks for the article. I believe that many companies and business owners need to take a more active role in their digital marketing strategies rather than passing off all of the blame to unscrupulous SEO companies. In many ways the SEO company and the business owners enable each other and perpetuate these risky techniques.

Business owners typically demand near-immediate results and want to dominate their competition, even if their competitors have been in business for 10+ years longer, have tens of thousands more links, and much more content. Not only do the expectations of many business owners not align with the current reality of the modern search era, but they also want these services at bargain basement prices (and in some cases they want them for FREE, thinking they should only pay when "results" are achieved).

How do these unrealistic expectations affect SEO companies?

It forces many SEO experts in the industry to skate closer to the black-hat line in order to meet the client's expectations (or risk turning away business), use tactics that are much more risky, or outsource much of the work to cut costs while also degrading the quality.

Some techniques that the SEO specialist deems "grey-hat" at the time, and relatively low risk, end up getting hammered years later when Google's technology finally catches up. The fact is that it never was grey hat to begin with: grey-hat SEO merely represents black-hat techniques that Google isn't detecting or penalizing... yet.

Even those companies that try to educate potential prospects on the evolution of search are often met with resistance. Most business owners do NOT believe you. They are quick to reply that there are all of these other companies that claim they can rank them on the first page in 30 days, 60 days, etc.

If they stick with you, there is a very short leash and incredible pressure to deliver results QUICKLY or they will cancel. This drives many SEO specialists to push Google's limits.

If they don't go with your company, they will waste money and get burned over and over again. You might think this would teach them a lesson and reinforce your marketing philosophy, as you warned them from the beginning. Amazingly, that typically is NOT the lesson that is learned. Instead they blame the industry as a whole and conclude that all SEOs are scam artists, all the while searching for the next low-quality marketing company that can promise them the world and not deliver. They want the silver bullet and believe that there is SOMEONE out there who can wave a magic wand or sprinkle some SEO pixie dust that will give them a short-cut to the top. And they think this because for YEARS such techniques did actually exist.

Why Does This Cycle Perpetuate?

It is because, whether you are using black-hat tactics or what you perceive to be "grey-hat" tactics, the INTENT is the same: you are trying to trick Google into thinking a site has more value than it actually has.

Digital Marketing isn't a campaign with a start and end date where you write a check and wait for the organic traffic to come rolling in. It is a methodology of how you will interact with your target audience FOREVER.

Many business owners do not have the resources to adequately compete in the markets they want to target. They cobble together just enough to get the ball rolling with an SEO company as they cross their fingers that it will work out knowing full well they only have enough funds for a 30-60 day runway.

The agency does what they can with the budget they have (even assuming they are an honest and skilled company to begin with). When the client doesn't get the unrealistic results they were expecting they often cancel. Most small businesses do not consider Digital Marketing as part of their cost of doing business like having a sign out front or ordering business cards. They never actually have a budget for it and if they do it is not a budget that can effectively compete in their industry (even locally). They could save money by doing some of the work themselves but they often don't have the time or desire to do this either. It is hard enough just to get a two paragraph description of their business from them.

MASSIVE CHANGES to the search industry

With Google's recent updates over the last few years, one thing is clear: those that have whole-heartedly embraced their digital strategy in a way that builds credibility, engages their audience, and builds lasting value in their brand will be rewarded, and for the most part will be insulated from most future penalties.

Those that continuously try to take short-cuts no matter how they try to justify it, no matter how cheap it might be, no matter how effective the short-term tactic may be, will never get ahead because the intent is always to pull levers to manipulate search signals.

There are definitely many shady SEOs out there but for the most part they are meeting the demand in the marketplace that business owners have created. It is time for a serious reality check. It is time for business owners to understand how search engines and digital marketing works and demand that kind of excellence from their marketing providers. They must also realise that it takes time for that natural domain authority and brand recognition to build and with additional time comes additional investment.

galoretech
Posted on March 6th 2014 at 12:58PM

Altogether, 100% authentication is required for any business. 2013 was a big year for Google and the SEO industry, with lots of big changes made by Google and other search engines. As an SEO firm, we also updated our SEO techniques to serve customers at http://www.galoretech.com