Twitter has announced that it’s expanding its test of downvotes on tweet replies, with more users on both iOS and Android set to see the new down arrow on tweet responses, providing another way to signal your thoughts on each comment.
As explained by Twitter:
“We learned a lot about the types of replies you don't find relevant and we're expanding this test - more of you on web and soon iOS and Android will have the option to use reply downvoting. Downvotes aren’t public, but they'll help inform us of the content people want to see.”
“We're hoping to better understand what people believe are relevant replies, and how that matches up to what Twitter suggests as the most relevant replies under a Tweet."
So the idea is that these insights will help Twitter improve its algorithms to display the best replies under each tweet. The downvotes aren’t a measure of your personal response to a stated opinion, nor are they designed to be a way to bury bad tweets, as they are on Reddit.
But in effect, that’s both how they work and what they’ll be used for – though Twitter has also provided some insight into the initial learnings that it’s gleaned from the first months of the test:
“A majority of our users shared that the reason they clicked the down arrow was either because the reply was perceived as offensive, or because they perceived it as not relevant, or both. This experiment also revealed that downvoting is the most frequently used way for people to flag content they don't want to see. Finally, people who have tested downvoting agree it improves the quality of conversations on Twitter.”
In theory, Twitter’s downvote system is designed to push down spam and junk comments, but in reality, people are downvoting things that they find personally offensive and/or things they don’t want to see. Which could, of course, be seen as a form of censorship, especially if more people look to use the option to blitz dissenting opinions.
But maybe that’s what users want – the very act of using the downvote button on any reply is an intentional signal that you don’t like, or don’t want to see, that type of content on the platform, in your own personal experience. The concept is not designed to moderate replies as such, since downvote counts are not public, but it may end up serving as a way for Twitter to learn exactly the types of things that people hate. Which could improve the general quality of conversation in the app, but again, may be viewed as censorship.
Eventually, I can imagine some will see this as Twitter’s own form of ‘shadowbanning’, with their replies getting less exposure and engagement as a result of downvotes. But then again, it would also be very difficult for the tweet author to see how the process has impacted their comment, as they wouldn’t necessarily be able to view how it’s ranked in the overall reply chain.
The downvotes also aren’t on general tweets, just replies, which does reduce the overall impact. But Twitter could still look to incorporate what it learns from the test into its overall tweet ranking algorithm, which could affect general tweets as well.
It’s an interesting test, and the Reddit model clearly serves as a template of sorts, with Twitter even coloring its downvote button in Reddit orange.
Essentially, it serves a similar purpose to Reddit downvotes, in surfacing the most engaging, most interesting discussion points within each thread to boost user engagement. And maybe, if it is successful, Twitter could consider expanding downvotes to all tweets.
That would be a very interesting development. Twitter hasn’t suggested that it might even be considering this, but it could be the logical progression to boost user engagement.
That would be a game-changer for the platform, and the court of public opinion.