Welcome to the dark side of the metaverse: Here come harassment and assaults

Harassment, assaults, bullying and hate speech already run rampant in virtual reality games, which are part of the metaverse, and there are few mechanisms to easily report the misbehaviour, researchers said.

Experts said that misbehaviour in virtual reality is typically difficult to track because incidents occur in real time and are generally not recorded. (Photo: iStock/anon-tae)

Chanelle Siggens recently strapped on an Oculus Quest virtual reality headset to play her favourite shooter game, Population One. Once she turned on the game, she manoeuvred her avatar into a virtual lobby in the immersive digital world and waited for the action to begin.

But as she waited, another player’s avatar approached hers. The stranger then simulated groping and ejaculating onto her avatar, Siggens said. Shocked, she asked the player, whose avatar appeared male, to stop.

“He shrugged as if to say: ‘I don’t know what to tell you. It’s the metaverse – I’ll do what I want,’” said Siggens, a 29-year-old Toronto resident. “Then he walked away.”

The world’s largest tech companies – Microsoft, Google, Apple and others – are hurtling headlong into creating the metaverse, a virtual reality world where people can have their avatars do everything from play video games and attend gym classes to participate in meetings.

In October, Mark Zuckerberg, Facebook’s founder and chief executive, said he believed so much in the metaverse that he would invest billions in the effort. He also renamed his company Meta.

Yet even as tech giants bet big on the concept, questions about the metaverse’s safety have surfaced.

In one popular virtual reality game, VRChat, a violating incident occurs about once every seven minutes, according to the nonprofit Center for Countering Digital Hate.

Bad behaviour in the metaverse can be more severe than today’s online harassment and bullying. That’s because virtual reality plunges people into an all-encompassing digital environment where unwanted touches in the digital world can be made to feel real and the sensory experience is heightened.

“When something bad happens, when someone comes up and gropes you, your mind is tricking you into thinking it’s happening in the real world,” Siggens said. “With the full metaverse, it’s going to be so much more intense.”

THE FAR-REACHING IMPACT OF BAD BEHAVIOUR IN VIRTUAL REALITY

Toxic behaviour in gaming and in virtual reality is not new. But as Meta and other huge companies make the metaverse their platform, the issues are likely to be magnified by the companies’ reach over billions of people. The companies are encouraging people to join the metaverse, with Meta, which makes the Oculus Quest headsets, cutting prices for the products during the holidays.

Zuckerberg, who appears aware of questions about the metaverse’s harms, has promised to build it with privacy and safety in mind. Yet even his own lieutenants have wondered whether they can really stem toxic behaviour there.

In March, Andrew Bosworth, a Meta executive who will become chief technology officer in 2022, wrote in an employee memo that moderating what people say and how they act in the metaverse “at any meaningful scale is practically impossible”. The memo was reported earlier by The Financial Times.

Kristina Milian, a Meta spokesperson, said the company was working with policymakers, experts and industry partners on the metaverse. In a November blog post, Meta also said it was investing US$50 million in global research to develop its products responsibly.

Meta has asked its employees to volunteer to test the metaverse, according to an internal memo viewed by The New York Times. A stranger recently groped the avatar of one tester of a Meta virtual reality game, Horizon Worlds, a company spokesperson said. The incident, which Meta has said it learned from, was reported earlier by The Verge.

"DIFFICULT TO TRACK AND DOCUMENT" 

Misbehaviour in virtual reality is typically difficult to track because incidents occur in real time and are generally not recorded.

Titania Jordan, the chief parent officer at Bark, which uses artificial intelligence to monitor children’s devices for safety reasons, said she was especially concerned about what children might encounter in the metaverse. She said abusers could target children through chat messages in a game or by speaking to them through headsets, actions that are difficult to document.

“VR is a whole other world of complexity,” Jordan said. “Just the ability to pinpoint somebody who is a bad actor and block them indefinitely or have ramifications so they can’t just get back on, those are still being developed.”

Callum Hood, head of research at the Center for Countering Digital Hate, recently spent several weeks recording interactions in the VRChat game, which is made by a developer called VRChat and largely played through Oculus Quest headsets.

In the game, people can form virtual communities and have their avatars play cards, party in a virtual club or meet in virtual public spaces to talk. Oculus rates it as safe for teenagers.

Yet over one 11-hour period, Hood said, he recorded more than 100 problematic incidents on VRChat, some involving users who said they were under 13. In several cases, users’ avatars made sexual and violent threats against minors, he said. In another case, someone tried showing sexually explicit content to a minor.

MORE CAN BE DONE TO PUT MEASURES IN PLACE

Hood said the incidents had violated Oculus’ terms of service, as well as those of VRChat. He said he had reported his findings to both companies but had not heard back.

“VRChat is unsafe because its developers and Facebook have failed to put basic measures in place to ensure abusive users cannot access its services,” he said. “They have created a safe haven for abusive users at the same time as inviting minors to enter the metaverse.”

Milian said Meta’s community standards and VR policy outline what is allowed on its platform, which developers must adhere to. “We don’t allow content that attacks people based on race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability,” she said.

Minors are not permitted to create accounts or use Oculus devices, she said. Part of the responsibility, she added, lies with the developers of the apps.

VRChat did not respond to a request for comment.

After Siggens faced abuse while playing the Population One virtual reality game, she said, she joined a virtual support group for women, many of whom also play the game. Members regularly dealt with harassment in the game, she said. In June, Meta acquired BigBox VR, the developer of Population One.

Another member of the support group, Mari DeGrazia, 48, of Tucson, Arizona, said she saw harassment and assault happen in Population One “two to three times a week, if not more.”

“Sometimes, we see things happen two to three times a day that violate the game’s rules,” she added.

BigBox VR did not respond to a request for comment.

DeGrazia said the people behind Population One had responded to her complaints and appeared interested in making the game safer. Despite the harassment, she said, she has found a community of virtual friends whom she regularly plays the game with and enjoys those interactions.

“I’m not going to stop playing, because I think it’s important to have diverse people, including women, playing this game,” she said. “We aren’t going to be pushed out of it, even though sometimes it’s hard.”

In July, DeGrazia wore a haptic vest – which relays sensations through buzzes and vibrations – to play Population One. When another player groped her avatar’s chest, “it felt just awful,” she said.

She noted that Zuckerberg has described a metaverse where people can be fitted with full-body suits that let them feel even more sensations, which she said was troubling.

Siggens said she had ultimately reported the user account of the person who groped her in Population One through a form within the game. She later received an automated response saying punitive action had been taken against the user.

“I don’t know if they were banned for a day or for a week or for forever,” she said. “Either way, it just keeps happening.”

An hour after the incident with the stranger’s avatar, Siggens said, her avatar was groped again by a different user.

By Sheera Frenkel and Kellen Browning © The New York Times

This article originally appeared in The New York Times.
