For user experience designers, virtual reality is a brave new world. The conversation around managing user behaviour in virtual reality has intensified in the wake of a woman’s account of being sexually assaulted in a VR game. Because the technology is so new, design best practices and standards around harassment, assault, and abuse have not been established.


Unfortunately, online harassment is nothing new, as we’ve seen countless times in forums, on Twitter, and over Xbox Live headsets. 40% of internet users have personally experienced online harassment. What makes harassment in VR different is how immersive it is. VR is supposed to make you experience another world as if it were real. People attempting to save a cat from a plank are genuinely terrified they will fall 200 meters. Abuse is going to feel much more real, and with virtual reality becoming more and more mainstream, this problem is only going to get worse.

Since online harassment began, people have been trying to control it from different angles. Cyberbullying laws have been enacted to punish offenders, and discussion boards are moderated to remove harassing messages. UX designers have been brought in as well. One example is requiring social media logins to comment on articles; the idea is to apply public social pressure to those who would otherwise leave hateful messages. Another is giving users the ability to flag offensive content for moderation. These are established techniques with varying effectiveness within a relatively established medium.

Virtual reality is different.

Traditional UX design considers a single user at a time. The designer’s job is to make that person’s experience easy and intuitive. While thousands of people may be doing the same thing simultaneously, the focus is always on the individual user’s experience.

There are established theories and best practices for managing a single user’s behaviour when the only uncontrollable element is that user. But we’re dealing with virtual reality: thousands of users who interact not only with the product but with each other. It becomes much more challenging to influence their behaviour. There is far less control.

This raises the question of whose responsibility it is to influence and moderate user behaviour in virtual reality. Other than the users themselves, of course. VR game developers certainly share it, and the developers behind the game in which the woman was assaulted responded quickly. They proposed a ‘power gesture’ that simulates opening a force field, causing nearby players to vanish. An idea similar to blocking someone. This puts immediate control in the hands of the victim, and it is certainly a step in the right direction. But is it enough?
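To make the idea concrete, here is a minimal sketch of how such a gesture might be wired up on the client, assuming a hypothetical session API. Player, Session, and hideAndMute are illustrative names, not a real VR SDK:

```typescript
// Hypothetical sketch of a 'power gesture' block mechanic.
// All types and method names here are assumptions for illustration.

interface Vec3 { x: number; y: number; z: number; }

interface Player {
  id: string;
  position: Vec3;
}

interface Session {
  localPlayer: Player;
  nearbyPlayers(radius: number): Player[]; // players within `radius` meters
  hideAndMute(playerId: string): void;     // client-side only: stop rendering and voicing that avatar
}

const FORCE_FIELD_RADIUS = 5; // meters; an assumed tuning value

// Invoked when the gesture recogniser detects the force-field motion.
function onPowerGesture(session: Session): void {
  for (const other of session.nearbyPlayers(FORCE_FIELD_RADIUS)) {
    // One-sided and immediate: the victim's view changes at once,
    // with no confirmation step that would prolong the incident.
    session.hideAndMute(other.id);
  }
}
```

Keeping the block one-sided and instant matters: a confirmation dialog would drag out the very incident the gesture is meant to end.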

What can we do?

How can UX designers shape the VR user experience to prevent harassment from happening in the first place? How do you design an ‘offensive behaviour’ flagging system in VR? What kind of repercussions should exist for users who behave offensively?

These are tough questions. In real life we have a hard enough time preventing abuse, harassment, and assault. However, with VR, since we can at least control the environment, maybe we can influence people to behave better.

Flagging and Moderation

Given the gesture-based nature of virtual reality, perhaps a gesture could be developed to flag users who exhibit offensive behaviour. One suggestion is that a user point at the offending party and raise their other hand to signal flagging. A visual and verbal snapshot, a micro-recording of the interaction between the two users, could be sent to a moderator, who would see the context in which the offending user was flagged and apply the appropriate repercussions.
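As a rough sketch of that flow, the following assumes a hypothetical rolling buffer of recent audio and avatar motion plus a moderation queue; none of these names refer to an existing API:

```typescript
// Illustrative sketch of gesture-based flagging. The interfaces below
// are assumptions, not an existing VR platform API.

interface FlagGesture {
  reporterId: string;
  targetId: string; // the player being pointed at
}

interface InteractionSnapshot {
  reporterId: string;
  flaggedId: string;
  audioClip: ArrayBuffer;  // e.g. the last 30 seconds of voice chat between the two users
  poseTrack: Float32Array; // avatar motion over the same window, for visual context
  capturedAt: number;      // Unix timestamp
}

interface RollingBuffer {
  audioBetween(a: string, b: string): ArrayBuffer;
  posesBetween(a: string, b: string): Float32Array;
}

interface ModerationQueue {
  submit(snapshot: InteractionSnapshot): Promise<void>;
}

// Fired when a user points at another player and raises their other hand.
async function onFlagGesture(
  gesture: FlagGesture,
  buffer: RollingBuffer,
  queue: ModerationQueue
): Promise<void> {
  const snapshot: InteractionSnapshot = {
    reporterId: gesture.reporterId,
    flaggedId: gesture.targetId,
    audioClip: buffer.audioBetween(gesture.reporterId, gesture.targetId),
    poseTrack: buffer.posesBetween(gesture.reporterId, gesture.targetId),
    capturedAt: Date.now(),
  };
  // The moderator reviews the snapshot with full context before acting.
  await queue.submit(snapshot);
}
```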

Depending on the context of the flagging, the consequences could vary in severity. A moderator could ban the user for a period of time, freeze their account or IP address indefinitely, or send the information to law enforcement at the request of the offended user. Existing cyberbullying laws already cover online interactions and could plausibly be extended into virtual reality.
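One way to express those tiers, assuming a moderator has already reviewed the snapshot and assigned a severity (the Enforcement interface and its methods are hypothetical placeholders):

```typescript
// Sketch of tiered consequences after moderator review.
// The enforcement functions are hypothetical placeholders.

type Severity = 'minor' | 'serious' | 'criminal';

interface Enforcement {
  temporaryBan(userId: string, days: number): void;
  freezeAccount(userId: string): void; // indefinite account/IP freeze
  referToLawEnforcement(userId: string, snapshotId: string): void;
}

function applyConsequence(
  severity: Severity,
  userId: string,
  snapshotId: string,
  victimRequestedReferral: boolean,
  enforce: Enforcement
): void {
  switch (severity) {
    case 'minor':
      enforce.temporaryBan(userId, 7); // assumed duration
      break;
    case 'serious':
      enforce.freezeAccount(userId);
      break;
    case 'criminal':
      enforce.freezeAccount(userId);
      // Escalation happens only at the offended user's request.
      if (victimRequestedReferral) {
        enforce.referToLawEnforcement(userId, snapshotId);
      }
      break;
  }
}
```

Note that escalation to law enforcement stays opt-in for the victim, matching the suggestion above.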

Removing Anonymity

Psychology tells us that people feel less connected to the human being on the receiving end of online abuse, in part, because of the lack of body language. We are unsure of the other person’s intent, so we unconsciously react negatively to this ‘perceived threat’. With this one psychological tidbit in mind, how can designers and developers work together to reduce the disconnect?

We could link virtual reality accounts to existing social media accounts, similar to commenting requirements. In theory, this could remove some of the anonymity, humanize other users, and encourage people to behave better: they can see the person on the other end of their behaviour and have an easier time empathizing. Or would it open users up to increased prejudice and abuse?
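A sketch of what that gate might look like, assuming a hypothetical identity service; the provider list and interface are illustrative, not a real platform requirement:

```typescript
// Sketch of gating multiplayer behind a linked social identity,
// echoing social-login commenting requirements. All names are assumptions.

interface LinkedIdentity {
  provider: 'facebook' | 'twitter' | 'linkedin';
  displayName: string; // shown on the avatar's nameplate to humanize the user
  profileUrl: string;
}

interface IdentityService {
  lookup(userId: string): Promise<LinkedIdentity | null>;
}

// Multiplayer sessions refuse entry until an identity is linked.
async function enterMultiplayer(
  userId: string,
  identities: IdentityService
): Promise<LinkedIdentity> {
  const identity = await identities.lookup(userId);
  if (identity === null) {
    throw new Error('A linked social account is required for multiplayer.');
  }
  return identity; // rendered on the user's avatar so others see a real name
}
```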

Testing with a Diverse Group

Regardless of the methods selected to reduce abuse, they need to be designed and tested with a diverse user group over a long period of time. Sadly, women have long been underrepresented in tech, so their experiences and perspectives are often overlooked. And since 40% of all internet users have experienced harassment online, these situations should be treated as ‘common scenarios’ in testing.

These testing scenarios should look at how each method affects both victims and abusers. Take the force field gesture: does the user disappearing from view change the power balance? Does the abuser win? Does this become an incentive for abusers to see how many users they can make ‘disappear’?
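One illustrative way to make those questions concrete is to encode them as paired test scenarios; the structure below is an assumption, not an established testing framework:

```typescript
// Illustrative encoding of abuse-mitigation test scenarios so that both
// perspectives are exercised as 'common scenarios', not edge cases.

interface TestScenario {
  name: string;
  method: 'power-gesture' | 'flagging' | 'identity-linking';
  victimPerspective: string[]; // questions answered from the victim's side
  abuserPerspective: string[]; // questions answered from the abuser's side
}

const forceFieldScenario: TestScenario = {
  name: 'Force field gesture under sustained harassment',
  method: 'power-gesture',
  victimPerspective: [
    'Does making the abuser vanish restore a sense of control?',
    'Or does the victim feel pushed out of the shared space instead?',
  ],
  abuserPerspective: [
    'Does being vanished read as a loss, or as a game to repeat?',
    'How many disappearances before the behaviour escalates elsewhere?',
  ],
};
```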

Final Thoughts…

You will never be able to fully control how a user acts or behaves, but you can influence them by shaping what they interact with. A combination of removing the abuser immediately, flagging them for moderation, and applying appropriate repercussions is a good start. This is a big challenge for UX designers, who are now working in unfamiliar territory. The onus is on them to set the standards that will become the best practices moving forward.
