It’s disappointing, but one thing you can always be certain of with any social technology is that some people will use it to harass and abuse others, in any way they can.
Most recently, that’s come up in virtual reality, with various incidents of women being attacked in Meta’s evolving VR world, in exceedingly concerning ways.
Back in December, The Verge reported that a beta tester for Horizon Worlds, Meta’s social media replacement in VR, was groped by a stranger within the digital realm. Then earlier this month, a woman said that she had been “virtually gang-raped” in the VR environment.
These are obviously major problems, especially as Meta looks to make a bigger shift towards VR as part of its metaverse development. Which is why today, again disappointingly, Meta has been forced to implement a new personal boundary for VR avatars in both Horizon Worlds and Horizon Venues.
As explained by Meta:
“Personal Boundary prevents avatars from coming within a set distance of each other, creating more personal space for people and making it easier to avoid unwanted interactions. Personal Boundary will begin rolling out today everywhere inside of Horizon Worlds and Horizon Venues, and will by default make it feel like there is an almost 4-foot distance between your avatar and others.”
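Conceptually, the mechanic Meta describes amounts to a minimum-distance constraint between avatar positions. Here’s a minimal sketch in Python of how such a constraint could work in principle – all names and the 1.2 m figure (roughly the “almost 4-foot” buffer) are illustrative assumptions, not Meta’s actual implementation:

```python
import math

# Hypothetical illustration of a "personal boundary" as a minimum-distance
# constraint between two avatars. Not Meta's actual implementation.
MIN_DISTANCE = 1.2  # roughly the "almost 4-foot" buffer, in meters

def clamp_approach(mover, other, min_dist=MIN_DISTANCE):
    """Return the closest position `mover` may occupy without entering
    `other`'s personal boundary. Positions are (x, y) tuples in meters."""
    dx, dy = mover[0] - other[0], mover[1] - other[1]
    dist = math.hypot(dx, dy)
    if dist >= min_dist:
        return mover  # already outside the boundary; no adjustment needed
    if dist == 0:
        # Degenerate case: exact overlap, push out along an arbitrary axis
        return (other[0] + min_dist, other[1])
    # Push the mover back out to the boundary ring, along the same bearing
    scale = min_dist / dist
    return (other[0] + dx * scale, other[1] + dy * scale)

# An avatar trying to stand 0.5 m away gets held at the 1.2 m ring.
print(clamp_approach((0.5, 0.0), (0.0, 0.0)))  # → (1.2, 0.0)
```

The key design point is that the constraint is enforced by the platform, not opted into by the victim – an avatar simply cannot be moved inside another’s buffer.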
This is why we can’t have nice things.
Of course, functionally, that doesn’t change much in the current VR space; it’s only disappointing that we need such measures at all. But evidently we do, and with Meta seeking to convert as many people as it can over to its new, more immersive connection spaces – especially with its main app now losing active users – it’s clearly felt the need to implement such protection measures immediately, to avoid further harm and negative reports.
Because, as Jeff Goldblum’s character notes in Jurassic Park, “life finds a way” – and that works in both a positive and a negative sense. Social media platforms have provided more ways to stay connected with others than ever before: we’re now more able to find like-minded people, learn about other cultures, and explore individual niches and interests in ways that simply weren’t possible in times past.
But social media has also facilitated the formation of increasingly harmful groups, the concerted harassment of people with dissenting opinions, the spread of misinformation and disinformation at huge scale, and the objectification and violation of users for any reason that people may choose.
Users should not have to deal with these elements. We should, in theory, be able to use these technologies for good – which has been the underlying hope of social media CEOs and visionaries, who’ve often seemingly turned a blind eye to the flip side of the coin. But the impact of such harms is significant – arguably, on balance, more significant than the positives.
But there’s no going back now. Social platforms are already embedded in how we interact, which means that the host providers simply have to keep improving their systems to cater for misuse, and counter it wherever they can.
It’s not possible to eliminate such behavior entirely. Again, this is human nature, and as Meta’s executives have repeatedly noted, its platforms are merely a reflection of society and broader societal trends. It’s not Meta’s fault that people have negative impulses and choose to project them via its apps.
But then again, it also is – which is why Meta is doing all it can to address these issues.
VR opens up all new forms of harassment, and will provide a medium for many more incidents like this. And that’s before we get into the more questionable use cases for VR technology, and the impacts that they might have on people’s behavior.
Surely putting users into a more immersive virtual environment, where they can harass and demean people and commit fictional crimes, is not great for their mental approach to real life, and how they act in public. Yet that’s very likely where we’re headed, with Meta set to launch Grand Theft Auto: San Andreas in VR sometime this year.
It does look like an interesting and engaging gaming experience. But the way that characters are treated in GTA is overwhelmingly negative, and various studies have shown that playing violent video games, especially GTA, in 2D form can increase aggressive behaviors and desensitize people to violence.
I can only imagine the same applies more directly to a fully immersive experience like this. Of course, GTA VR will be rated R, and will only, theoretically, be available to adults. Just like every other GTA game.
It’s a major concern – when you’re building an alternative world, with more stimuli and more inputs immersing you in an entirely different environment, that also cranks up the risk factors, and could lead to much bigger mental and developmental impacts in various ways.
But again, tech CEOs seem blinded by the positives and the potential of what’s to come: this will replace real-world interactions, create all new ways to interact and share unique experiences with your loved ones, reduce loneliness, and enable virtually anything that you can dream of.
But not all dreams are filtered through a positive lens, and not all people will be aligned in the same approach.
Overlooking the negatives might help Meta make more money, but it will also lead to more real-world harm, in many ways.
Building in buffer zones for avatars is a disappointingly necessary development. But it’s likely only the start.