There are several creepy and concerning elements to Facebook's latest VR project, which sees the company looking to build a more engaging, realistic personal presence within digital worlds. That sounds fine in theory, but in reality, it looks a little like this:
That's creepy - not only is there the uncanny valley effect, where it's a little too close to reality, yet just off enough to feel like a distorted alternate vision, but it's also just... weird.
This is the latest variation of Facebook's in-development realistic VR avatars, which are part of The Social Network's plans to make VR interaction as life-like as possible.
As explained by Facebook:
"If telepresence lets you feel like you’re somewhere else, then social presence lets you share that sensation with other people. [...] We colloquially refer to this as passing the ‘ego test’ and the ‘mother test’. You have to love your avatar and your mother has to love your avatar before the two of you feel comfortable interacting like you would in real life. That’s a really high bar.”
So Facebook wants your avatar to look realistic enough that your mother feels for it. I mean, that's cool, but still. Creepy.
Not only that, but you can just imagine the varying ways in which such avatars could be misused and manipulated, much as Facebook itself has been in recent times. Which is why it's also a little concerning to read comments from Facebook like:
“We’ve considered all possible use cases for this technology,” says [Chuck] Hoover [of FRB]. “We’re aware of the risk and routinely talk about the positive and negative impacts this technology can have.”
If Facebook comes at this technology assuming that it has safeguarded against every possible use case, you can safely bet that it won't have, and all sorts of impersonation and related issues will occur. Likely, they'll occur no matter what Facebook does, but even so, it's probably not great for Facebook to be touting its foresight in this regard.
The idealistic goal of this technology - which, it's worth noting, is still some years away from being rolled out for general use - is to create sci-fi-like connection capacity. Think the hologram messages in Star Wars, or the full-body projections of his father that Superman sees. It's astounding to consider that we're not far from such tools becoming a reality, but if the last few years have shown us anything, it's that the ideals of greater connection and the realities don't always neatly align. If there's a way for people to exploit a new capability, they will, and with all the controversy around how Facebook has allowed the misuse of personal data, it's hard to imagine those same users welcoming another invasive measure, like full-body scanning, in order to provide a more engrossing virtual experience.
But it's early days - there's still a long way to go before Facebook starts putting out a call for users to scan themselves in. That's why Facebook needs to build consumer trust now - it needs to rebuild the bridges burned by the Cambridge Analytica and Russian manipulation scandals, and set about reassuring people that their uploaded information is safe. That no one will access their private information and use it against them. That no one will be able to log on as them and speak through their avatar.
It's some way off, but this will be another major point of concern for regulators and government officials, and another prompt for further oversight into how such technology is handled.
User privacy concerns are only going to become more significant - and a little more creepy. Whether our legal and regulatory systems are prepared for that shift is another key question.