VR shows promise in aiding navigation of people with blindness or low vision

Published On: December 18, 2024 | Last Updated: September 23, 2025

A new study offers hope for people who are blind or have low vision (pBLV) through an innovative navigation system that was tested using virtual reality (VR).

The system, which combines vibrational and sound feedback, aims to help users navigate complex real-world environments more safely and effectively.

The research advances work toward developing a first-of-its-kind wearable system to help pBLV navigate their surroundings independently.

“Traditional mobility aids have key limitations that we want to overcome,” said Fabiana Sofia Ricci, the paper’s lead author at New York University.

“White canes only detect objects through contact and miss obstacles outside their range, while guide dogs require extensive training and are costly. As a result, only two to eight per cent of visually impaired Americans use either aid.”

In this study, the research team miniaturised the haptic feedback hardware of its earlier backpack-based system into a discreet belt equipped with 10 precision vibration motors. The belt’s electronic components, including a custom circuit board and microcontroller, fit into a simple waist bag, a crucial step toward making the technology practical for real-world use.

The system provides two types of sensory feedback: vibrations through the belt indicate obstacle location and proximity, while audio beeps through a headset become more frequent as users approach obstacles in their path.
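The two feedback channels described above can be illustrated with a minimal sketch. This is not the team's actual firmware; the motor count (10) comes from the article, while the detection range, beep intervals, and function names are hypothetical assumptions chosen for illustration:

```python
NUM_MOTORS = 10           # vibration motors spaced around the waist belt (per the article)
MAX_RANGE_M = 3.0         # hypothetical obstacle-detection range
MIN_BEEP_INTERVAL_S = 0.1 # hypothetical fastest beep rate (obstacle very close)
MAX_BEEP_INTERVAL_S = 1.0 # hypothetical slowest beep rate (obstacle at max range)

def motor_for_bearing(bearing_deg: float) -> int:
    """Map an obstacle's bearing (degrees clockwise from straight ahead)
    to the index of the belt motor covering that direction."""
    sector = 360.0 / NUM_MOTORS
    return int((bearing_deg % 360.0) // sector)

def vibration_intensity(distance_m: float) -> float:
    """Vibration grows stronger (0.0 to 1.0) as the obstacle gets closer."""
    d = min(max(distance_m, 0.0), MAX_RANGE_M)
    return 1.0 - d / MAX_RANGE_M

def beep_interval(distance_m: float) -> float:
    """Audio beeps become more frequent (shorter interval) on approach."""
    d = min(max(distance_m, 0.0), MAX_RANGE_M)
    frac = d / MAX_RANGE_M
    return MIN_BEEP_INTERVAL_S + frac * (MAX_BEEP_INTERVAL_S - MIN_BEEP_INTERVAL_S)

# An obstacle 0.5 m away, slightly to the user's right:
print(motor_for_bearing(20.0))    # which motor vibrates
print(vibration_intensity(0.5))   # how strongly
print(beep_interval(0.5))         # how often the headset beeps
```

The key design idea this captures is the division of labour the study reports: vibration location encodes *where* the obstacle is, while both vibration intensity and beep rate encode *how close* it is.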

“We want to reach a point where the technology we’re building is light, largely unseen and has all the necessary performance required for efficient and safe navigation,” said John-Ross Rizzo, an associate professor in NYU Tandon’s Biomedical Engineering department.

“The goal is something you can wear with any type of clothing, so people are not bothered in any way by the technology.”

The researchers tested the technology by recruiting 72 participants with normal vision, who wore Meta Quest 2 VR headsets and haptic feedback belts while walking around NYU’s Media Commons at 370 Jay Street in Downtown Brooklyn, an empty room with only side curtains.

Through their headsets, the participants experienced a virtual subway station as someone with advanced glaucoma would see it – with reduced peripheral vision, blurred details, and altered colour perception.

The environment, created with Unity gaming software to match the room’s exact dimensions, allowed the team to determine how well participants could navigate using the belt’s vibrations and audio feedback when their vision was impaired.

“We worked with mobility specialists and NYU Langone ophthalmologists to design the VR simulation to accurately recreate advanced glaucoma symptoms,” said Maurizio Porfiri, the paper’s senior author.

“Within this environment, we included common transit challenges that visually impaired people face daily – broken elevators, construction zones, pedestrian traffic, and unexpected obstacles.”

Results showed that haptic feedback significantly reduced collisions with obstacles, while audio cues helped users move more smoothly through space. Future studies will involve individuals with actual vision loss.

The technology complements the functionality of Commute Booster, a mobile app being developed by a Rizzo-led team to provide pBLV navigation guidance inside subway stations. Commute Booster “reads” station signage and tells users where to go, while the haptic belt could help those users avoid obstacles along the way.
