Issues Google Needs To Address in 2014
Posted on December 16th 2013
All is not well in the world of Google.
In the post-Hummingbird landscape of SEO, monitor any message board and you'll find that sentiment towards Google is generally negative. Perhaps this isn't surprising, given that SEO black-hatters, slow or reluctant to change their ways, have been hit hard by the latest changes. However, among many ethical SEOs who were already using sustainable, Google-pleasing techniques, there is still a lot of negativity cast in the search giant's direction.
Claims of anti-competitive and unethical practices are on the rise, and webmasters are increasingly vocal in condemning Google's often nonsensical policies. Take its stance on 'bad' links, for example. Google remains woolly, but generally describes bad links as those from low-quality directory or bookmark sites, text advertisements that pass PageRank, and advertorials. These guidelines aren't as clear as they might initially seem. How do you define a 'low-quality' directory? More to the point, how does Google define one?
Read more about low quality links and Link Schemes here.
Directories such as the long-standing, human-edited DMOZ are a pretty safe bet, but what about all the others? Currently, all webmasters can do is make an educated guess, looking at factors such as relevancy and Domain Authority, but it's not always easy to know which directories and websites are on Google's safe list and which are on its black list. It would of course be a huge help if Google gave clearer information, but it doesn't.
Disavow Can Harm Your Site's Performance
This has led to webmasters using the disavow tool on vast swathes of backlinks, in an attempt to limit damage already caused or to reduce their chances of picking up a future penalty. However, disavowing a large number of backlinks, some of which may be actively helping your website rank well, can itself be harmful: you may see your Domain Authority fall, for example.
Google’s own Eric Kuan warns “The disavow backlinks tool should be used with caution, since it can potentially harm your site’s performance.” At the same time, Google’s Matt Cutts suggests that aggressive use of the disavow tool is the way to go in his recent video, How can a site recover from a period of spamming links?
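For anyone reaching for the tool anyway, a disavow file is just a plain text file uploaded via Webmaster Tools, with one URL or `domain:` entry per line and `#` comments ignored. A minimal sketch (the domains below are hypothetical placeholders, not real sites):

```
# Low-quality directory we asked to remove our listing, with no response
domain:spammy-directory-example.com

# A single advertorial page passing PageRank
http://paid-links-example.net/sponsored-post.html
```

Using `domain:` disavows every link from that host, which is why the aggressive approach Cutts describes is so easy to overdo.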
Google has effectively created an environment where it warns against having too many low-quality backlinks and suggests using the disavow tool, whilst at the same time warning against using that same tool to clear them up. In both cases, the overall result is that the average website is more likely to rank poorly, leading to decreased visibility. So why does Google choose to do this? Simple: a loss in organic rank makes it more necessary for website owners to turn to paid search, so Google benefits from an increase in AdWords revenue.
Of course, Google could choose to simply ignore these bad links (as it historically did). Alternatively, it could let webmasters know which links to remove by providing a notification and a grace period in which action would need to be taken. Instead, without any warning, Google chooses to hand out often drastic penalties, leading in some cases to a devastating drop in rank or exclusion altogether from search results. Recovering from a penalty is often a long and drawn-out process (I'm talking several months here, not days), as webmasters flounder around, trying to manually identify and disavow 'bad' links and remove any potential over-optimisation that might have historically been done on the website.
Now, almost every business I've worked with has used a number of different SEO companies over the years. Businesses pay for the services of an SEO expert and put their trust in them, in the implicit belief that the techniques being used will have a positive, rather than a negative, effect.
The simple truth is that Google is now penalising thousands of businesses who have done nothing more than unwittingly employ a black-hat SEO company at some time in their past. I don't disagree that Google needs to tackle webspam, and I believe the recent algorithm change has proved successful in doing this up to a point. However, the simple fact remains: thousands of website owners are suffering due to harsh penalties, and are being forced down the paid search route as a direct result.
Read more about Google’s quality guidelines here.
Sure, there are always going to be those who try to game the system, but I don't believe for one second that every single website that's picked up a penalty or suffered a drop in rank has an evil black-hatter at the helm, hell-bent on playing the system. I would say that these sorts of websites are in the minority.
Some of The Issues Google Needs to Address in 2014
1. More information and better support for webmasters
Better support is needed for webmasters struggling to recover from penalties. Whilst Google does report on manual actions within Google Webmaster Tools, penalties are often picked up with no such manual action being reported. This makes it extremely difficult to figure out what a website is being penalised for, making recovery lengthy and, ultimately, less likely. Webmasters need better and more detailed information to help them recover, and a clean-up grace period needs to be introduced.
2. Credit to original content sources
Google also needs to be better able to identify (and penalise) low-quality content aggregation websites that simply scrape other blogs for content. I'm fed up, and I know other bloggers are too, of seeing content from my blog published on sites I've not authorised to share it, with no rel="canonical" link element. A site like Yahoo Business has far more traffic than my own blog, as well as thousands more backlinks, better Domain Authority and so on. The chances of my blog being credited as the source are slim, making it more likely that the scraping site obtains the benefits.
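For context, the rel="canonical" link element sits in the page's head and tells search engines which URL should be treated as the original source. A minimal sketch, with a hypothetical blog URL standing in for the real one:

```html
<head>
  <!-- Points search engines at the original article, so the
       syndicating page doesn't compete with (or outrank) it -->
  <link rel="canonical" href="http://www.example-blog.com/original-post/" />
</head>
```

The problem, of course, is that this only works if the scraping site chooses to include it.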
3. Full Implementation of publisher markup
I also hope that we see Google publisher markup working effectively and consistently, as this is still currently very hit or miss in my experience.
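As a reminder of what's involved, publisher markup at the time is simply a link from the site's homepage to its Google+ brand page. A sketch, with a hypothetical page name:

```html
<head>
  <!-- Associates the website with its Google+ brand page,
       so Google can surface brand information in results -->
  <link rel="publisher" href="https://plus.google.com/+ExampleBrand" />
</head>
```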
4. Alternatives to local search for internet only small businesses
Finally, I'd like to see Google start supporting SMEs with more innovation in the area of local search. Worryingly, an increasing gap is being created when it comes to search visibility. At one end of the scale you have the big brands with plenty of cash, who are able to maintain prominence in search via AdWords. At the other end you have local businesses, being serviced relatively well by Google Local listings. Somewhere in the middle, though, we have a plethora of small internet-only businesses who are struggling to maintain visibility in organic search, who can't afford extensive PPC campaigns, and for whom local search is irrelevant.
How do you feel about the recent changes that Google has made? Have you been affected by a penalty, or perhaps you've recovered? What issues do you see facing Google over the coming year?