
Headset Harms

NC State computer scientists are studying how to ensure safety in virtual reality.

Illustration by Neil Webb

When you put on a virtual reality (VR) headset, it looks back at you. Sensors track the movement of your eyes, the temperature of the room and the gestures you make. It takes data to create the immersive experience VR promises, and a single session can generate more than a million data points on the person using it. That introduces potential risks that users, VR developers and policymakers are only beginning to understand. Entering Meta’s Horizon Worlds, VRChat or other popular VR environments can expose users’ IP addresses and other data. And in the exchanges that happen in those worlds, minors could share information about themselves and their lives that they’d otherwise protect.

An NC State computer science research team aims to better inform all those stakeholder groups and develop systems to make VR safer.

“People don’t understand how VR is different from other technologies,” says Abhinaya S B, a third-year doctoral student in computer science. “VR is like a whole new world in three dimensions.” The sensitivity and handling of data are only one aspect of the VR-related issues that Abhinaya and Anupam Das, an assistant professor of computer science, are exploring. Harassment and bullying in VR ecosystems are also significant problems probed in two recent studies by the pair.

One paper gathers VR users’ experiences of harassment, as well as software developers’ struggles to build safety controls into VR systems. The other, spurred by the growing use of VR by minors, probes parents’ perceptions of their children’s safety.

Abhinaya and Das found that parents were more focused on potential harm to their kids’ eyes than on higher-risk threats like cyberbullying by other VR users. Parents and users alike were unaware of existing safety controls. Separately, the researchers found that developers were frustrated by their limited ability to prioritize safety, given their employers’ focus on creating features to attract new customers.

The next step for Abhinaya and Das — and the subject of Abhinaya’s dissertation — is to develop systems capable of automatically detecting harassment and either addressing it or alerting moderators and system administrators to it.

“The existing tools,” Das says, “don’t really perform well.” 
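As a rough illustration of what such a pipeline might look like, here is a minimal sketch in Python. The term weights, threshold and alert_moderator stub are hypothetical stand-ins, not the team’s actual detector, which would likely rely on trained models and richer signals (voice, gestures, proximity) rather than a word list.

# Hypothetical sketch: score incoming VR chat messages for likely
# harassment and alert a human moderator when a threshold is crossed.
# The term weights, threshold and alert stub are illustrative only.

from dataclasses import dataclass

# Illustrative term weights; a real detector would use a trained model.
HARASSMENT_TERMS = {"loser": 0.4, "stupid": 0.3, "go away": 0.5}
ALERT_THRESHOLD = 0.6

@dataclass
class ChatMessage:
    user_id: str
    text: str

def harassment_score(message: ChatMessage) -> float:
    """Sum the weights of flagged terms in the message, capped at 1.0."""
    text = message.text.lower()
    score = sum(w for term, w in HARASSMENT_TERMS.items() if term in text)
    return min(score, 1.0)

def alert_moderator(message: ChatMessage, score: float) -> None:
    # Stand-in for paging a moderator or system administrator.
    print(f"[ALERT] user={message.user_id} score={score:.2f}: {message.text!r}")

def moderate(messages: list[ChatMessage]) -> None:
    for msg in messages:
        score = harassment_score(msg)
        if score >= ALERT_THRESHOLD:
            alert_moderator(msg, score)

if __name__ == "__main__":
    moderate([
        ChatMessage("u1", "nice build!"),
        ChatMessage("u2", "you stupid loser, go away"),  # triggers an alert
    ])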

