The metaverse already has a groping problem


Katherine Cross, who researches online harassment at the University of Washington, says that because virtual reality is immersive and feels real, toxic behavior that occurs in that environment is real as well. “At the end of the day, the nature of virtual-reality spaces is such that it is designed to trick the user into thinking they are physically in a certain space, that their every bodily action is occurring in a 3D environment,” she says. “It’s part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous system and psychological responses.”

That was true for the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated in the Plaza [the virtual environment’s central gathering space].”

Sexual harassment and groping are not new to virtual worlds, nor is it realistic to expect a world in which these problems disappear entirely. As long as there are people who hide behind their screens to evade moral responsibility, they will keep happening.

The real problem, perhaps, has to do with expectations: when you play a game or take part in a virtual world, you enter into what Stanton describes as a “contract between developer and player.” “As a player, I’m agreeing to be able to do what I want in the developer’s world according to their rules,” he says. “But as soon as that contract is broken and I no longer feel comfortable, the company’s obligation is to return the player to where they want to be and make them feel comfortable again.”

The question is: whose responsibility is it to ensure that users are comfortable? Meta, for example, says it gives users access to tools to keep themselves safe, effectively shifting that responsibility onto them.

Meta spokesperson Kristina Milian said: “We want everyone in Horizon Worlds to have a positive experience with safety tools that are easy to find, and it’s never a user’s fault if they don’t use all the features we offer. We will continue to improve our UI and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”

Milian says users must go through an onboarding process before joining Horizon Worlds that teaches them how to activate the Safe Zone. She added that regular reminders are loaded onto screens and posters within Horizon Worlds.

Screenshot of the Safe Zone interface, courtesy of Meta
But the fact is that the victim of the Meta groping either didn’t think to use the Safe Zone or couldn’t access it. That, says Cross, is exactly the problem. “The structural question is the big issue for me,” she says. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the power to take care of yourselves.’”

And that is unfair, and it doesn’t work. Safety should be easy and accessible, and there are plenty of ideas for making it so. To Stanton, some kind of universal signal in virtual reality, perhaps a version of QuiVr’s V gesture, could relay to moderators that something was wrong. Fox thinks an automatic personal-distance bubble would help unless two people mutually agree to be closer. And Cross believes it would be effective for training sessions to spell out the norms that already exist in ordinary life: “In the real world, you wouldn’t randomly grope someone, and you should carry that over into the virtual world.”

But until we figure out whose job it is to protect users, one big step toward a safer virtual world is disciplining aggressors, who often get off scot-free and remain able to participate online even after their behavior becomes known. “We need deterrents,” Fox says. That means making sure bad actors are found and suspended or banned. (Milian says Meta “[doesn’t] share specifics about individual cases” when asked what happened to the alleged groper.)

Stanton regrets not pushing for industry-wide adoption of the power gesture and failing to speak up more about Belamire’s groping incident. “It was a missed opportunity,” he says. “We could have avoided that incident at Meta.”

If anything is clear, it is this: there is no body that is plainly responsible for the rights and safety of those who participate in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.


