I maintain that many of X’s perception problems could be solved by simply instituting a PR and/or communications department.
Yesterday, amid ongoing questions about its seemingly laxer moderation processes under Elon Musk, and reports pointing to X hosting and amplifying misinformation around the Israel-Hamas war, X shared an update on its efforts via Community Notes and its evolving ad placement tools.
On Community Notes, X says that “hundreds of new notes” are going live every day, matched to thousands of posts, which are being seen in the app “millions of times”.
Community Notes is X’s big hope for offloading at least some of the content moderation calls that previous Twitter management had made, with the process essentially enabling users to decide what should and should not be allowed in the app.
And while X continues to admit more Notes contributors, and is expanding the places where Notes can be seen in the app, external analysis suggests that most Notes are still never shown at all, due to the approval process, which requires general agreement among contributors before a note is displayed.
The Notes process also means that content generally has time to go viral before a note can be attached.
As explained by disinformation researcher Conspirador Norteno:
“Although X’s crowdsourced Community Notes fact checking system has resulted in some of the most popular false posts being labeled with additional context, it has struggled in multiple ways to address the flood of misinformation regarding the war. The sheer volume of posts has overwhelmed the pool of volunteers who write and rate notes, posts often go viral quite quickly and rack up massive engagement before notes appear, and the system is most effective at labeling individual popular posts and does not scale well to situations where a misleading claim ends up being repeated by large numbers of small accounts (organic or otherwise).”
Various researchers have found the same: that Community Notes, while an interesting and potentially valuable complement to internal moderation, is not a replacement for it, with the system failing to stop the flow of false reports in the app.
X, though, seems to be trying to turn it into a full crowd-sourced moderation solution, which, again, would mean that X management would no longer be required to make so many calls on what’s true and what’s not.
Conceptually, it makes sense. But in reality, the process, thus far at least, seemingly has a lot of holes that are unlikely to be filled.
Also, as Conspirador Norteno points out, the incentives of X’s new creator revenue share program, which prompts users to spark as many replies as possible, conflict with the platform’s efforts to restrict misinformation, in many respects.
In terms of ad exposure, X is also working to reassure advertisers that they can control their ad placement, in order to avoid unwanted association with misinformation or controversial topics, through its ad tools.
X says that brands can now add up to 4,000 negative keywords and 4,000 accounts to avoid within their ad settings, to ensure that their ads are not shown where they don’t want them to be. That’s a lot, and I can’t imagine many brands coming up with 4,000 exclusions on their own. But then again, there are generic block lists available, created by ad safety partners, which could help to cover a lot of potentially risky terms.
X also has its new “Sensitivity Settings” to help brands control general ad placement suitability on the Home Timeline, while it also notes that brands can opt out of ad surfaces and placements which don’t support Adjacency Controls and Sensitivity Settings, like search or replies.
So there are options to help brands avoid unwanted ad placement if they’re concerned about reports of misinformation in the app. But the problem is that many brands likely won’t get this info, because of the way in which X has communicated these updates in the app.
X is working to promote more long-form content in the app, via longer tweets and videos, as a means to facilitate more forms of creation and engagement. Elon’s vision is that, eventually, X will be able to compete with every other app, which includes YouTube for longer video, and Substack for longer posts.
If X can host all of this, that’s how, in Elon’s view at least, it can win. But the problem right now is that long-form content is not what people are used to reading in the app.
People are used to coming to X/Twitter to catch up on the latest headlines, which they can then tap through for more info. The fact that more content is now being jammed into X posts runs counter to how people have traditionally used the app, which reduces the focus on the content itself, and will likely see many people miss updates posted natively to X.
Including X’s own updates, which are now coming via long-form X posts.
“As advertisers navigate a paid presence on X, we wanted to remind you of the tools you have to create campaigns, with highest degree of control:

Adjacency Controls: create negative keyword and/or account handle lists for ads running in X’s Home Timeline. The lists prevent your…”

— Business (@XBusiness) October 17, 2023
People are simply not accustomed to reading all of that info in a single X post, which means that many are probably overlooking the meat of X’s announcements, because they’re not formatted in a traditional announcement style.
That’s a habitual shift, and if X wants to make it happen, it needs to stick with it. But it still seems like X would be better served by a dedicated communications department that could properly communicate and share such updates.
In any event, X is going to do things its own way, and iterate on its weak points over time.
That will see X evolve, but it could be a slow grind on some fronts, which could lead to more problems for the app as it works to evolve into its new form.
That’ll impede the app’s development, and again, many of these issues could have been addressed by simply having an official brand comms voice.