Losing the Hacker Way Is Killing Facebook
Posted on August 9th 2013
As a data-led individual, I was encouraged to read about Facebook’s reasons for slowing the ‘hacker way’ on Wired earlier in the week. It is a sign of maturity in the company that rollouts are carefully monitored, and results pored over and analysed in order to iterate and ensure the best possible product for all users. This is a sensible approach. However, I believe Facebook may be taking it too far. That’s right: a data analyst by trade, arguing that too much data is being used!
The reason is that I believe Facebook’s slow rollout is damaging the overall product over too long a time period. Take my Facebook account, for example. At present I have Graph Search (and the modern-looking top nav), the old-style NewsFeed, but the new-style timeline. Now, I’m sure I’m seen as an interesting segment by the data analysis team over at Menlo Park, providing useful user study results. From my perspective as a user, however, I’m getting a disjointed experience that leaves me confused as to the overall look and feel of the site.
Using a very unscientific method (asking each member of the team in the office to log in and show me their Facebook), it’s clear that I’m not the only one with a jarring experience, and that a number of variations are circulating: some don’t have Graph Search, some have the new NewsFeed but the old timeline, and so on.
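This mix of variations is consistent with how staged rollouts are commonly implemented: each feature is gated independently, often by hashing the user ID into a per-feature bucket, so every account ends up with its own combination of old and new. A minimal sketch of that idea (the function name, feature names, and percentages below are hypothetical illustrations, not Facebook’s actual gating system):

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically decide whether a user is in a feature's rollout.

    Hashing user_id together with the feature name means each feature
    rolls out to an independent slice of users, which is exactly why one
    account can have Graph Search but not the new NewsFeed.
    """
    digest = hashlib.md5(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    return bucket < percent

# Hypothetical per-feature rollout percentages.
features = {"graph_search": 0.60, "new_newsfeed": 0.30, "new_timeline": 0.45}
user = "alice@example.com"
enabled = {f for f, pct in features.items() if in_rollout(user, f, pct)}
print(enabled)  # some subset of the features; the combination varies per user
```

Because the assignment is deterministic per user, each person’s odd combination is stable for months, which is exactly the limbo described above.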
What’s worse is that my Facebook account, and many others, have been in limbo like this for months. The new NewsFeed was announced in March and Graph Search in January; we’re now eight months down the line and rollouts are still gradually occurring. This wouldn’t be a problem if the new look weren’t such a step-change from the old site. Once you have part of the new Facebook design, it becomes apparent how outdated the old version has become.
Compare this with Google, and you’ll see the difference in speed. Google Plus fundamentally changed its design for all users overnight in May. Admittedly, Google Plus is a much less used platform right now, but its speed of development and change must be a worry to Facebook. Google are very quiet about their testing, but you’d imagine complex testing was done before, during and after launch to validate the design changes; the main difference is their ability to react and do things quickly despite their maturity and size.
The ideal launch process is built on measurable, pre-defined tests derived from a hypothesis, with actionable outputs that can be implemented fast. The key to this process is to learn fast and change things. Facebook has drifted too far from its ‘move fast and break things’ mantra, and needs to pull back from ultra-testing towards a learn-quickly-and-iterate approach. In principle these are very similar, just semantically different: if, rightly or wrongly, Facebook are leaving the hacker way behind and don’t want to risk breaking their product for the sake of speed, they need a culture in which they can learn from and iterate on their releases much, much quicker.
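A “pre-defined test with an actionable output” can be as simple as a hypothesis about an engagement rate, checked with a standard two-proportion z-test once the experiment has run its course. A sketch of that workflow (the bucket sizes and event counts are made-up numbers, and the 0.05 threshold is a conventional choice, not anything Facebook has published):

```python
import math

def two_proportion_z_test(events_a, n_a, events_b, n_b):
    """Compare an engagement rate between control (A) and variant (B)."""
    p_a, p_b = events_a / n_a, events_b / n_b
    # Pooled rate under the null hypothesis that A and B are identical.
    pooled = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical rollout: 10,000 users per bucket, engagement events counted.
z, p = two_proportion_z_test(events_a=1200, n_a=10000, events_b=1290, n_b=10000)
decision = "ship" if p < 0.05 else "iterate"
print(f"z={z:.2f}, p={p:.3f} -> {decision}")
```

The point is not the statistics but the workflow: the hypothesis, the metric, and the decision rule are fixed before launch, so the moment the data arrives the result is actionable rather than something to be pored over for months.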
Segmentation is naturally a key aspect, and with over a billion users this must be a lot of fun for their data team. However, it would be great to see a qualitative element brought in, as I’ve yet to see any evidence of large-scale Facebook survey data. Many users, like me, have requested access to the new NewsFeed, for example; surely these keen early adopters are the ideal people to give the NewsFeed to and then survey for extra feedback?
It’s understandable that releases from Facebook will become slightly slower: the volume of changes is growing, and they now have a large shareholder pool to look after, meaning they can’t risk hurting their ad revenues. But even with 1.15 billion users, Facebook should have the capability to test and learn quickly and efficiently. A balance has to be found between studying every facet of the data and providing a good user experience. The simple fact is that if they fail in this, they will fail to stay relevant.