AR/VR headset users are at risk of revealing sensitive information via speech, new Rutgers study finds
Rutgers researchers are studying security vulnerabilities in augmented reality and virtual reality (AR/VR) headsets that could allow hackers to steal users' sensitive information, according to a press release.
Yingying Chen, a professor in the Department of Electrical and Computer Engineering, led the creation of a tool called “Face-Mic” to demonstrate how hackers could infer sensitive information by making use of users’ speech-associated facial movements while wearing AR/VR headsets, according to the release.
These AR/VR headsets use voice-based and gestural commands in combination with motion sensors to identify users with speech recognition and subtle facial movements, Chen said.
The sensors, which include accelerometers and gyroscopes, are always active when the headset is in use, as opposed to microphones that usually require explicit user permission to record data, she said.
Specifically, the researchers found that these motion sensors collect data on bone-borne vibrations, speech-associated facial movements and airborne vibrations, which can be used to infer users' gender and identity without their permission, according to the release.
Chen said this poses a severe security issue, as the data collected by the headsets could be used to identify the user, classify their gender or age and present them with targeted advertisements. In addition, this constitutes a breach of user privacy, especially since users are typically unaware that their data is being tracked, she said.
She said that she anticipates a broader range of uses for the AR/VR headsets, which could include immersive gaming, commercial transactions and even online banking.
“We have already seen some (applications) that support online shopping," Chen said. "When people are playing games, they (are shown) advertisements (and) then they can just click … and can purchase things during the games.”
The headsets are also utilized for educational purposes, such as surgery demonstrations and various academic presentations, she said.
Chen said that people are typically unaware of the security risks associated with using these headsets and do not know when their information is being stolen, as it can happen through clicking seemingly innocent links.
“(The attackers) are going to have a certain kind of malware that can be put into a website, and (when the user) is gaming, they can go to that website,” she said. “(Users) just innocently press anything, any key or any link, and it is automatically downloaded into the headset. Then this malware will help to collect all this information.”
People are more at risk of having their information stolen by attackers when credit card numbers, passwords, birth dates or Social Security numbers are communicated over the headsets, Chen said.
The security issues with these headsets need to be fixed by manufacturers, she said, since users cannot easily disable the motion sensors without disrupting headset functionality.
Alternatively, manufacturers could mix generic data into the motion sensor data to confuse hackers and prevent them from extracting user-specific information, Chen said. This method, though, could also degrade the usefulness of the data for the headset's own functions.
As for how users can better protect themselves while using headsets, she said they should avoid giving out sensitive information like passwords and Social Security numbers.
While there are pressing security concerns with the headsets, the same facial motion detectors could potentially be used to authenticate users and prevent others from accessing sensitive information, Chen said.
In addition, the motion sensors can be used to monitor users' vital signs, such as abnormal breathing rates, she said. This could alert users to excessive anxiety or stress during long gaming sessions, which can contribute to high blood pressure or heart attacks.
Ultimately, Chen said she believes that it is important to address the ethical concerns regarding these headsets and understand the significant security risks.
“We want the general public to be aware of the possible vulnerabilities of the headsets,” she said. “The other side is that we want to alert the headset manufacturers so that ... they can think about this and learn how to avoid the problems.”