from the how-well-do-they-combat-harassment-IRL? dept.
Sydney Smith had dealt with lewd, sexist remarks for more than a month while playing the Echo VR video game. But the 20-year-old reached her breaking point this summer.
[...] Smith tried to figure out which player had harassed her, so she could file a report. But that was tough because multiple people were talking at the same time. Since she hadn't been recording the match, Smith couldn't rewatch the encounter and look for a username.
[...] Smith isn't the only virtual reality player who's had trouble reporting an ugly run-in. Though Oculus and Echo VR, both owned by Facebook, have ways to report users who violate their rules, people who've experienced or witnessed harassment and offensive behavior in virtual environments say a cumbersome process deters them from filing a report. Content moderators have to examine a person's behavior as well as their words. (Oculus' VR policy says users aren't allowed to follow other users against their wishes, make sexual gestures or block someone's normal movement.)
As Facebook focuses on creating the metaverse -- a 3D digital world where people can play, work, learn and socialize -- content moderation will only get more complex. The company, which recently rebranded as Meta to highlight its ambitions, already struggles to combat hate speech and harassment on its popular social media platforms, where people leave behind a record of their remarks. Immersive spaces such as Horizon Worlds, envisioned by CEO Mark Zuckerberg, will be even more challenging to police.
This story is partly based on disclosures made by Frances Haugen, a former Facebook employee, to the US Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. A consortium of news organizations, including CNET, received redacted versions of the documents obtained by Congress.
"The issue of harassment in VR is a huge one," Haugen said. "There's going to be whole new art forms of how to harass people that are about plausible deniability." The tech company would need to hire substantially more people, and likely recruit volunteers, to adequately deal with this problem, she said.
Facebook has more than 40,000 people working on safety and security. The company doesn't break down how many are dedicated to its VR platform.