Facebook Clarifies Errors with Metrics - What it Means for Marketers
Earlier this week Facebook announced that it had found some of the metrics it's been providing to advertisers are inaccurate.
If that sounds familiar, it's because Facebook did much the same thing a couple of months back, when it found that some of the numbers it had been providing on video views were also wrong.
So what should we make of this? Does this mean more of Facebook's metrics are incorrect and we just don't know it yet? Do you need to revise your strategy as a result of the clarified data?
Here's a full rundown of the cause and effect of each, including the September video update.
Average Duration of Video Viewed
In September, Facebook clarified that they'd been calculating their Average Duration of Video Viewed metric incorrectly.
For some time - years in fact - Facebook had been using this calculation to provide this stat:
Total time spent watching / Total number of people who had played the video for three or more seconds
This meant that everyone who watched for less than three seconds was excluded from the denominator, so the reported average was higher than the true average across all viewers.
The impact of this is hard to define, as it's a case-by-case proposition, but some have suggested that the error inflated the reported average duration by around 60%.
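To see why the old formula inflates the number, here's a quick sketch with made-up watch times - the figures are purely illustrative, not Facebook's data:

```python
# Hypothetical watch times (in seconds) for ten viewers of one video.
watch_times = [1, 2, 2, 1, 12, 30, 8, 15, 20, 9]

# Facebook's old formula: total time watched, divided only by the
# number of people who played the video for three or more seconds.
qualified_viewers = [t for t in watch_times if t >= 3]
reported_avg = sum(watch_times) / len(qualified_viewers)

# The true average counts every viewer who started the video.
true_avg = sum(watch_times) / len(watch_times)

print(reported_avg)  # 100 / 6, roughly 16.7 seconds
print(true_avg)      # 100 / 10 = 10.0 seconds
```

In this invented example, the reported figure comes out roughly 67% higher than the true average - in the same ballpark as the estimates above.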
Upon discovering the error, Facebook put out a statement saying that they'd fixed it and reviewed their practices to ensure it wouldn't occur again.
"We informed our partners and made sure to put a notice in the product itself so that anyone who went into their dashboard could understand our error. We have also reviewed our other video metrics on the dashboard and have found that this has no impact on video numbers we have shared in the past, such as time spent watching video or the number of video views. We want our clients to know that this miscalculation has not and will not going forward have an impact on billing or how media mix models value their Facebook video investments."
Page Insights - Organic Reach
Facebook reported this week that it's uncovered a bug which meant that one of the dashboards on the Page Insights tab was displaying incorrect data.
The only figure affected is the reach summary in the Page Insights overview - Facebook says the reach shown on this tab was miscalculated, displaying a sum of daily reach without removing repeat visitors from the count.
As per Facebook:
"The de-duplicated 7-day summary in the overview dashboard will be 33% lower on average and 28-day will be 55% lower. This bug has been live since May; we will be fixing this in the next few weeks. It does not affect paid reach."
Page Insights - Organic Reach Change
In addition to the reach error, Facebook has also reported that they're making an improvement to Page organic reach stats to better match how they calculate the same for paid content.
"On Pages, we've historically defined reach as a person refreshing their News Feed and the post being placed in their feed. For paid ads reports, we've moved to a stricter definition that only counts reach once the post enters the person's screen ("viewable impressions")."
With this change, Facebook estimates that reported reach for Pages will be 20% lower on average - so if you see a sudden shift, this may be the cause. Facebook says they'll provide an update to all Pages via a notification within your Insights tab when this change comes into effect.
Video - Measuring Completion
As part of their efforts to improve the accuracy of their data, Facebook's been working with measurement group Moat to add an extra layer of third-party verification. Moat has been analyzing Facebook's current system, and they uncovered another reporting error related to video completion rates.
The problem occurs when the video and audio tracks on a piece of content don't match up - as noted by Facebook:
"When partners upload their videos to Facebook, the full video length is recorded, but when the video delivers to people's devices, the length of the video can sometimes be a fraction of a second shorter or longer."
Because of this, Facebook was sometimes failing to count completed views - if the audio track ran slightly longer than the video, the video would finish while audio was still playing, and Facebook wouldn't register it as a 100% view, which it should have.
Facebook estimates that this may result in a 35% increase in the count of "video watches to 100%".
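Facebook hasn't published the exact logic, but the mismatch can be sketched like this - the lengths and the shape of the check are assumptions for illustration only:

```python
# Hypothetical lengths, in seconds (assumed values, not Facebook's).
uploaded_length = 10.0   # video length recorded at upload
delivered_length = 10.4  # delivered file: audio ran slightly longer
watched = 10.0           # viewer watched the entire visual track

# Buggy check: completion measured against the delivered length,
# so a view of the whole video still falls short of "100%".
buggy_counts_as_complete = watched >= delivered_length

# Fixed check: completion measured against the uploaded video length.
fixed_counts_as_complete = watched >= uploaded_length

print(buggy_counts_as_complete)  # False - the view was missed
print(fixed_counts_as_complete)  # True
```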
Instant Articles - Time Spent
Facebook also found an error in their reporting of average time spent per article on Instant Articles, with a calculation mistake leading to an over-reporting of this stat by around 7%-8%.
Facebook says that they were incorrectly averaging across a "histogram of time spent", instead of reflecting the total time spent reading an article divided by total views.
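Whatever the flawed histogram calculation looked like internally, the corrected formula is straightforward - total reading time divided by total views. A sketch with made-up session times:

```python
# Hypothetical reading sessions for one Instant Article, in seconds.
session_times = [45, 120, 30, 60, 90, 15]

# Corrected metric: total time spent reading divided by total views.
avg_time_spent = sum(session_times) / len(session_times)

print(avg_time_spent)  # 360 / 6 = 60.0 seconds
```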
Analytics for Apps
Facebook also uncovered an additional error in their reporting of referrals from apps.
In the Analytics for Apps dashboard, the "Referrals" tab displays all the posts produced by people via an app or website.
This data's supposed to be a reflection of clicks that went directly to an app or website, but Facebook found that they were also counting other clicks on those posts, including clicks to view photos or video.
"For power users of this metric (top apps that look at this data in the dashboard most frequently), we found that referrals have been overstated by approximately 6% on average. Other measurements of referrals, such as those appearing in Facebook's ads reporting tools, are unaffected."
Interest Lists - Follower Counts
Back in 2012, Facebook introduced 'Interest Lists' to help people follow content they might be interested in but didn't necessarily want flooding their News Feed.
This meant that you could create a list for, say, "Fiction Writers" and add publishers and authors to it, or a "Basketball" list with NBA players and official team accounts, which you could then check in on whenever you wanted.
But the feature has never really taken off. Facebook ran a test earlier this year with a replacement for interest lists, which provided topic-specific News Feeds, but we've not heard anything about it since, so it looks like that wasn't a winner either.
Because of this, Facebook has also announced that they're removing Interest Lists as an option - which is no big deal, given that you probably don't use them, but it will also result in a reduction in followers for some Pages and Profiles.
This is because inclusion on an Interest List counted as a follow - and if someone followed you both from their profile and via an Interest List, they were counted twice. All of those duplicate follower counts will now be removed.
Facebook says that most profiles will see a drop in followers of less than 5%.
So that's quite a few things to get through - and, I guess, the bigger issue, and concern for Facebook, is that these errors bring all of their other metrics into question. How can we rely on Facebook's data, given how much they've gotten wrong? How long will it be until another report comes out revealing that more figures were off?
Really, the best way to counter this is to cross-check all data against your own on-page metrics and Google Analytics stats. Most of the figures noted above will have little impact on the more common business goals - namely click-throughs and subsequent conversions. You can still track all your Facebook referrals for these stats in other ways, and it's important to do so to ensure that what you're seeing on Facebook matches the rest of your data.
On Facebook's side, they're working to facilitate more third-party verification of their data to give marketers more peace of mind, while they're also improving the accuracy of their naming structures (e.g. "view content" → "website view of content," and "video views" → "3-second video views") and updating their glossaries to ensure there's more consistent explanations of their metrics across the various platforms.
But really, how big an impact these errors will have, and how much trust you put into Facebook's data, is up to you. As noted, it's important to cross-check where possible and ensure the data you're seeing reflects the end results.
Follow Andrew Hutchinson on Twitter