With over 400 advertisers set to pause their Facebook ad spend from today as part of the expanding #StopHateforProfit campaign, The Social Network has made efforts on various fronts to address concerns around its policies regarding hate speech - and in particular, its decision to leave aggressive, threatening comments from US President Donald Trump up on its platform for all to see.
And while the campaign itself isn't likely to have any major, long-term revenue impacts for Zuck and Co., Facebook is very clearly concerned, as the push makes expanded effects - and lower ad revenue going forward - a real prospect.
Today, Facebook VP of Global Affairs and Communications - and former British MP - Nick Clegg published a memo outlining what Facebook is doing to address hate speech on its platform, and how it's responding to the calls to take more action.
As per Clegg:
"I want to be unambiguous: Facebook does not profit from hate. Billions of people use Facebook and Instagram because they have good experiences - they don’t want to see hateful content, our advertisers don’t want to see it, and we don’t want to see it. There is no incentive for us to do anything but remove it."
Clegg also highlights the connective power of social media, noting the role that Facebook has played in bringing people together during COVID-19, and even in helping to boost the #BlackLivesMatter protests through its platforms.
"It is worth remembering that when the darkest things are happening in our society, social media gives people a means to shine a light. To show the world what is happening, to organize against hate and come together, and for millions of people around the world to show their solidarity."
But hate speech will always be present on a platform serving billions of people - which, of course, isn't in question. The crux of the #StopHateforProfit campaign is, essentially, this post from President Trump.
Other comments from Trump have followed, but this one, threatening violence against protesters, is what started the push. Twitter added a warning to Trump's tweet sharing the same comments - the second time in as many weeks that Twitter had taken action on a tweet from Trump (the first relating to 'rigged elections'). Facebook opted to leave it up, unchanged, despite the historical context and the clear threat implied.
On such comments from political leaders, Clegg notes that:
"When we find hateful posts on Facebook and Instagram, we take a zero tolerance approach and remove them. When content falls short of being classified as hate speech - or of our other policies aimed at preventing harm or voter suppression - we err on the side of free expression because, ultimately, the best way to counter hurtful, divisive, offensive speech, is more speech."
This echoes the line Facebook CEO Mark Zuckerberg has taken on the issue - as does this:
"As a former politician myself, I know that the only way to hold the powerful to account is ultimately through the ballot box. That is why we want to use our platform to empower voters to make the ultimate decision themselves, on election day."
Facebook's view is that people should be able to see what elected leaders have to say, whether they like those comments or not. You voted them in, you have a right to understand who these people are, and what they think about such issues. So it leans towards leaving all comments from elected officials up.
The principle here makes some sense, but the counter-argument is that by allowing those in power to make such comments, Facebook empowers others to take similar stances. Yes, people can, and will, make judgments based on this. But what about those who agree with such comments - those who say 'well, Trump said we don't have to wear a mask', despite health officials advising the opposite?
Comments made by the President, in particular, have power and can lead to immediate impacts, which is why the #StopHateforProfit group has called on Facebook to take a stronger stance against such comments.
Then again, Facebook has been doing better on general hate speech. In his post, Clegg also points to a recent European Commission report which found that Facebook assessed 95.7% of hate speech reports in less than 24 hours, "faster than YouTube and Twitter". The data shows that Facebook, broadly, is improving at detecting and removing hate speech - and just this week, the platform took action against the concerning 'boogaloo' movement in the US.
Facebook is doing more overall; it's just not budging on its decision to leave comments from political leaders up. And at this point, maybe it can't - having dug its heels in for this long, Facebook may simply not be able to change its stance. It's made its call, and any move now would look like it had been bullied into change.
And it seems, based on meetings with ad execs and organizers, that it has no intention of shifting.
In addition to Clegg's post, Facebook executives have also been meeting with advertisers to discuss their concerns.
As per Reuters:
"Facebook executives including Carolyn Everson, vice president of global business solutions, and Neil Potts, public policy director, held at least two meetings with advertisers on Tuesday, the eve of the planned one-month boycott."
In addition to this, Zuckerberg himself has agreed to meet with the organizers of the #StopHateforProfit campaign - while Facebook has also published a post responding to the campaign's key points of action, outlining the work that Facebook is already doing on each.
Again, while the campaign may not have significant revenue impacts, Facebook is clearly concerned. There's more to it than the immediate loss of ad dollars: there's the perceptual shift, the tarnished brand image, and the fact that there are other options out there for advertisers, where they might just see better results.
The money advertisers don't spend on Facebook ads may well go to other platforms, giving them a chance to see how their campaigns perform via other means. That could mean at least some of that money never comes back to The Social Network.
There's a lot more at stake than just the immediate impact, and Facebook knows it. Yet, even in the face of this, it's still not shifting from its original stance - though Zuckerberg did recently announce a move to provide more explanation for its decisions.
Will that be enough? We'll have to wait and see.