A Complete List of Facebook's Misreported Metrics and What They Mean
Late last week, Facebook was forced to once again issue a clarification regarding inaccuracies in the metrics they've been providing to advertisers - their third such update in the last four months. And while some have jumped on the news as confirmation that Facebook's data can't be trusted - and even that the platform may not be as influential as many believe - it's important to look at the details of each case to understand what the inaccuracies are and how the misreporting actually happened.
Here's a complete list of all the reporting mistakes Facebook has identified in recent months, including an explanation of what each metric is supposed to represent, what it was actually showing, and how Facebook is working to correct it.
(NOTE: This post was updated on 12/16 to include the latest Instant Articles error)
1. Like, Share Buttons and Mobile Search Discrepancy
Last week, Facebook reported that they've identified a discrepancy between the counts for the Like and Share Buttons via their Graph API and the counts when you enter a URL into the search bar in the Facebook mobile app.
The issue was actually identified by Tim Peterson of Marketing Land as part of his investigation into BuzzFeed's report, which showed that fake news outperformed real news in the last three months of the US Presidential Election campaign. BuzzFeed's report had looked at the combined number of shares, likes and comments, so Peterson went looking for a way to isolate the share counts, specifically, for each article. Facebook updated their algorithm in June to give more weight to updates from friends and family, and Peterson wanted to break out the share counts to see whether shares were resulting in higher reach - which would mean the algorithm could also be playing a part in the distribution of fake reports.
To analyze shares specifically, Peterson compared Facebook's Graph API data for each post identified by BuzzFeed (which displays total engagement counts), then cross-checked each in the Facebook mobile app (which displays shares in isolation).
As explained by Peterson:
"For 20 fake news links and 20 real news links from BuzzFeed’s investigation — the August to Election Day groups — I pulled the total number of engagements from Facebook’s Graph API and the total number of people who shared the link from Facebook’s mobile search results. And that’s when I saw something weird. For three of those 40 links — to NBC News, Breitbart and Liberty News — Facebook said that more people had shared them than had shared or liked or commented on them. And the discrepancies weren’t small. At the time I checked, the NBC News link received 407,708 shares, per the mobile search results, but 237,881 shares, comments and likes, per the Graph API results."
Facebook says they don't know how this error has happened, but that they're working to fix it.
In this case, the impact of the error seems relatively small - not many, if any, businesses would be looking to make comparisons in this way, so the data displayed on the mobile app is likely not going to have had a major impact on measurements. But the concerning element is that Facebook doesn't know why this has occurred, whether it's a problem with how the Graph API counts shares or with the mobile app itself. If it's the API, there could be further impacts, though there's nothing to suggest that's the case as yet.
Going on the available info, this problem means that the data displayed in the mobile app on the number of people who've shared a link is wrong, and shouldn't be relied upon until Facebook issues an update.
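If you want to run a similar sanity check on your own links, the logic is simple: a share-only count can never legitimately exceed a combined shares + likes + comments count for the same URL. Here's a minimal sketch of that cross-check - the function name and the input dictionaries are hypothetical, and the figures are the ones Peterson reported for the NBC News link:

```python
# Hypothetical cross-check: flag links where a share-only count exceeds the
# combined shares + likes + comments count, which should be impossible if
# both data sources are consistent.

def find_discrepancies(graph_api_totals, mobile_share_counts):
    """Return links whose share-only count exceeds the combined engagement total."""
    flagged = {}
    for url, combined in graph_api_totals.items():
        shares_only = mobile_share_counts.get(url, 0)
        if shares_only > combined:
            flagged[url] = (shares_only, combined)
    return flagged

# Figures as reported for the NBC News link (link key is a placeholder):
graph_api = {"nbcnews-link": 237881}   # shares + likes + comments (Graph API)
mobile = {"nbcnews-link": 407708}      # shares only (mobile app search)

print(find_discrepancies(graph_api, mobile))
# The link is flagged: 407,708 shares alone cannot exceed
# 237,881 shares + likes + comments unless one count is wrong.
```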
2. Mis-counting Facebook Live Reactions
In addition to this, Facebook also last week reported that they've been miscounting when Reactions occurred on Facebook Live videos.
As shown here, Facebook's been providing Facebook Live creators with total Reactions counts on their content, but the data was inaccurate because only one Reaction per live viewer was being counted in the "On Post" data (middle column). Users, of course, can - and often do - register multiple Reactions during a video. These subsequent Reactions were being incorrectly listed in the "On Shares" count (third column). As a result, your Reactions counts would have made it seem like more people were reacting to replays of the broadcast than there actually were - the total counts were, and are, accurate, but Facebook was mis-allocating exactly when those responses occurred.
As noted by Facebook:
"The fix for this issue will apply to newly created Live videos, starting mid-December. It will increase “Reactions on Post” by 500% on average and will decrease them on “Reactions from Shares of Post” by 25% on average (actual impact to specific videos may vary)."
So more people respond more often to the live broadcast than the replay, and Facebook's data will now accurately reflect that. Important for Live creators.
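To make the bucketing error concrete, here's a minimal sketch of the mis-count as described above, using hypothetical per-viewer reaction counts (Facebook hasn't published the actual counting logic, so this is an illustration of the behavior, not their implementation):

```python
# Sketch of the Live Reactions mis-count. Each viewer may react multiple
# times during a live broadcast. The buggy logic credited only a viewer's
# first reaction to "On Post" and spilled the rest into "On Shares";
# the fix credits every live reaction to "On Post".

def count_reactions_buggy(reactions_per_viewer):
    on_post = sum(1 for n in reactions_per_viewer if n > 0)       # first reaction only
    on_shares = sum(max(n - 1, 0) for n in reactions_per_viewer)  # overflow mis-bucketed
    return on_post, on_shares

def count_reactions_fixed(reactions_per_viewer):
    on_post = sum(reactions_per_viewer)  # all live reactions credited to the post
    return on_post, 0

viewers = [3, 1, 0, 5, 2]  # hypothetical reaction counts for five live viewers

print(count_reactions_buggy(viewers))  # (4, 7)  -> replay/share engagement looks inflated
print(count_reactions_fixed(viewers))  # (11, 0) -> total (11) unchanged, only the split moves
```

Note that the total across both buckets is the same either way - which matches Facebook's statement that total counts were accurate and only the allocation was wrong.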
3. Average Duration of Video Viewed
In September, Facebook clarified that they’d been reporting their Average Duration of Video Viewed metric incorrectly.
This had been going on for some time - years, in fact. Facebook had been using this calculation to produce the stat:
Total time spent watching / Total number of people who had played the video for three or more seconds
This means that everyone who watched for less than three seconds was excluded from the denominator, so the actual average duration of viewed content was lower than what was being reported.
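The effect of shrinking the denominator is easy to demonstrate. Here's a minimal sketch using hypothetical per-viewer watch times:

```python
# Sketch of the Average Duration of Video Viewed error. The flawed average
# divided total watch time by only those viewers who watched 3+ seconds,
# inflating the reported figure.

watch_times = [1, 2, 2, 10, 30, 45]  # hypothetical per-viewer seconds watched

total_time = sum(watch_times)                          # 90 seconds in total
viewers_3s_plus = [t for t in watch_times if t >= 3]   # only 3 of 6 viewers qualify

flawed_average = total_time / len(viewers_3s_plus)     # denominator excludes <3s viewers
true_average = total_time / len(watch_times)           # denominator includes everyone

print(flawed_average)  # 30.0 -> what was being reported
print(true_average)    # 15.0 -> the actual average across all viewers
```

The more short-duration viewers a video has, the bigger the gap between the two figures - which is why the impact varied so much from case to case.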
The impact of this is hard to define, as it’s a case-by-case proposition, but some have suggested that the error inflated the reported average duration of video viewed by around 60%.
Upon discovery of the error, Facebook issued a statement saying that they’d fixed it and reviewed their practices to ensure it doesn’t occur again.
“We informed our partners and made sure to put a notice in the product itself so that anyone who went into their dashboard could understand our error. We have also reviewed our other video metrics on the dashboard and have found that this has no impact on video numbers we have shared in the past, such as time spent watching video or the number of video views. We want our clients to know that this miscalculation has not and will not going forward have an impact on billing or how media mix models value their Facebook video investments.”
4. Page Insights – Organic Reach
Facebook reported in November that it had uncovered a bug which meant that one of the dashboards on the Page Insights tab was displaying incorrect data.
The element in question is the 7-day and 28-day organic reach summary in the Page Insights overview dashboard.
This was the only figure affected – Facebook says the reach on this tab was miscalculated, showing a sum of daily reach but not removing repeat visitors from the count.
As per Facebook:
“The de-duplicated 7-day summary in the overview dashboard will be 33% lower on average and 28-day will be 55% lower. This bug has been live since May; we will be fixing this in the next few weeks. It does not affect paid reach.”
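The difference between summing daily reach and de-duplicating it comes down to repeat visitors. Here's a minimal sketch with hypothetical daily visitor sets:

```python
# Sketch of the Page Insights reach bug. The buggy summary summed the daily
# reach figures; the fix de-duplicates people who visited on more than one day.

daily_reach = [
    {"ann", "bob", "carl"},   # day 1 visitors (hypothetical)
    {"ann", "bob", "dina"},   # day 2
    {"ann", "eve"},           # day 3
]

buggy_summary = sum(len(day) for day in daily_reach)   # repeat visitors counted again each day
deduped_summary = len(set().union(*daily_reach))       # each person counted once

print(buggy_summary)    # 8
print(deduped_summary)  # 5  (ann, bob, carl, dina, eve)
```

The more loyal, repeat visitors a Page has, the more the summed figure overstates true reach - consistent with the 28-day summary dropping further (55%) than the 7-day one (33%), since longer windows accumulate more repeat visits.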
5. Instant Articles – Time Spent
Also in November, Facebook found an error in their reporting of average time spent per article on Instant Articles, with a calculation mistake leading to an over-reporting of this stat by around 7%-8%.
Facebook said that they were incorrectly averaging across the “histogram of time spent”, instead of dividing the total time spent reading an article by its total views.
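Based on that description, the error amounts to taking an unweighted mean of the histogram's bucket values rather than weighting each value by its view count. Here's a minimal sketch under that assumption - the histogram values are hypothetical, and the size of the gap here is illustrative, not the actual 7%-8%:

```python
# Sketch of the Instant Articles time-spent error, with a hypothetical
# histogram mapping time spent (seconds) to the number of views at that value.
# Averaging across the histogram's values treats each bucket equally; the
# correct average weights each value by its view count.

histogram = {10: 70, 60: 20, 120: 10}  # seconds spent -> number of views

flawed_average = sum(histogram) / len(histogram)  # unweighted mean of bucket values

total_time = sum(seconds * views for seconds, views in histogram.items())
total_views = sum(histogram.values())
true_average = total_time / total_views           # total time / total views

print(round(flawed_average, 1))  # 63.3 -> overstated
print(true_average)              # 31.0
```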
6. Analytics for Apps
Facebook also found an error in their reporting of referrals from apps.
In the Analytics for Apps dashboard, the “Referrals” tab displays all the posts produced by people via an app or website.
This data's supposed to be a reflection of clicks that went directly to an app or website, but Facebook found that they were also counting other clicks on those posts, including clicks to view photos or video.
“For power users of this metric (top apps that look at this data in the dashboard most frequently), we found that referrals have been overstated by approximately 6% on average. Other measurements of referrals, such as those appearing in Facebook’s ads reporting tools, are unaffected.”
7. Undercounting Instant Articles on iOS
On December 16th, Facebook confirmed that comScore had identified an error with Instant Articles data which caused an underreporting of iPhone traffic for IA content between September 20 and November 30, 2016. According to The Wall Street Journal, the error impacted less than 1% of overall traffic for affected publishers, though comScore estimates that some publishers’ traffic was undercounted by 10% to 20% in the period.
Facebook has now corrected the error and is working with comScore to produce updated estimates for the relevant time periods for partners affected. Facebook has also contacted impacted publishers.
iPad and Android traffic were not affected.
Fixing the Problems
In announcing each of these issues (with the exception of the mobile search discrepancy noted above), Facebook has also detailed how they're working to fix them. But obviously, the bigger issue and concern for Facebook is that these errors bring all of their other metrics into question - how can we rely on Facebook’s data given that they’ve got so much wrong? How long will it be till another report comes out revealing that their figures were off?
Really, the best way to counter any data errors is to cross-check everything with your own on-page metrics and Google Analytics stats. Most of the incorrect measurements will have little impact on the more common business goals - namely, click-throughs and subsequent conversions. And importantly, you can track all the Facebook referrals for these stats in other ways, separate from Facebook itself, and it’s important to do so to ensure that what you’re seeing matches with the rest of your stats.
On Facebook’s side, they’re working to facilitate more third-party verification of their data to give marketers more peace of mind, while they’re also improving the accuracy of their naming structures (e.g. “view content” → “website view of content,” and “video views” → “3-second video views”) and updating their glossaries to ensure there are more consistent explanations of their metrics across the various platforms.
But really, how big an impact these errors will have, and how much trust you put into Facebook’s data, is up to you.
As noted, the mistakes underline how important it is to cross-check where possible and to ensure the data being presented truly reflects the end results you experience.