Virtual reality and augmented reality are often associated with gaming. For many, the appeal lies in visiting an alien world or an exotic location, in going somewhere and doing something they couldn't do on their own. Bridging worlds seems to be a sweet spot for these technologies.
But VR and AR don't have to take users to far-off or fictional places. In several cases, AR and VR projects are working to better connect people with vision problems to the everyday world. Most of us take for granted the ability to sit at a computer at work and read the text on the screen, or to walk freely and confidently around our home or office. These technologies aim to help people reclaim some of the vision they've lost and make it easier to function in the world.
The World Health Organization estimates that 246 million people have low vision, which includes blurry vision, tunnel vision or blind spots that can't be corrected. The WHO also estimates that 90 percent of people with vision problems of any kind, not just low vision, have low incomes.
Frank Werblin, professor of neuroscience at the University of California, Berkeley, is working to bring a lower-cost vision aid to the low vision community. About a year and a half ago, he realized he could do this by piggybacking on virtual-reality technology.
IrisVision is an app that runs on a Samsung Gear VR headset. It responds to the wearer's head movements and magnifies whatever they're looking at directly, while still providing a wide field of view. It's meant to help users better see the world, and even read text on a computer screen.
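That head-tracked magnification is straightforward to sketch in code. The snippet below is a minimal illustration in Python with OpenCV, not IrisVision's actual software: it crops the patch the wearer is facing, enlarges it, and pastes it back over the full wide-angle frame so peripheral vision is preserved. The function name, zoom factor and patch size are all assumptions for illustration.

```python
import cv2
import numpy as np

def magnify_gaze_region(frame, center, zoom=3.0, patch=120):
    """Illustrative sketch of head-tracked magnification.

    frame  -- HxWx3 uint8 camera image (the wide field of view)
    center -- (row, col) point the head tracker says the user is facing
    zoom   -- magnification factor (assumed value, for illustration)
    patch  -- half-size in pixels of the region to magnify (assumed)
    """
    h, w = frame.shape[:2]
    cy, cx = center

    # Crop a patch around the gaze point, clamped to the frame.
    y0, y1 = max(cy - patch, 0), min(cy + patch, h)
    x0, x1 = max(cx - patch, 0), min(cx + patch, w)
    enlarged = cv2.resize(frame[y0:y1, x0:x1], None, fx=zoom, fy=zoom,
                          interpolation=cv2.INTER_LINEAR)

    # Paste the enlarged patch back, centered on the gaze point,
    # leaving the rest of the wide field of view untouched.
    out = frame.copy()
    eh, ew = enlarged.shape[:2]
    oy0, ox0 = max(cy - eh // 2, 0), max(cx - ew // 2, 0)
    oy1, ox1 = min(oy0 + eh, h), min(ox0 + ew, w)
    out[oy0:oy1, ox0:ox1] = enlarged[:oy1 - oy0, :ox1 - ox0]
    return out
```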
One of the biggest challenges he's trying to address is cost. Wearable vision aids can go as high as $15,000.
"There's a huge price gap between a magnifying glass which you could buy for $25 or $50 and what these people could really use, which is a wearable portable device, which is many thousands of dollars," he said.
IrisVision is available online for $2,500, and in certain clinics around the country on an experimental basis. The price includes the software, headset and phone needed to power the Gear VR. Werblin, who has been studying the way the eye's retina functions for 44 years, even took it to the California School for the Blind in Fremont, California.
Lazy eye no more
Werblin's not the first to use virtual-reality hardware this way. James Blaha had dealt with amblyopia, or lazy eye, his whole life. While researching the condition and tinkering with an early Oculus Rift headset, Blaha, now founder and CEO of Vivid Vision, a company that makes therapy software for the condition, discovered he could strengthen his weaker eye using virtual reality.
Amblyopia occurs when one eye is far less effective than the other and the brain suppresses its input, which undermines depth perception. Poor depth perception can make it hard to cross a busy street or drive, among other things. Conventional wisdom in the medical field has held that if the problem isn't fixed by the critical age of 8, it won't be fixed at all.
Blaha is proving that idea wrong. By cranking up the brightness in the goggles for just his weaker eye, he essentially forced his brain to stop suppressing it. He started seeing in 3D for the first time in his life, including seeing the keys on his keyboard in relief. These days, he has 90 percent of normal depth perception.
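In rendering terms, the approach Blaha describes amounts to boosting the stimulus delivered to the suppressed eye. Here's a rough, hypothetical sketch of that idea in Python, assuming per-eye image buffers and an illustrative gain value; the article doesn't specify Vivid Vision's actual parameters or pipeline.

```python
import numpy as np

def render_dichoptic(left_img, right_img, weak_eye="left", gain=1.8):
    """Brighten only the image shown to the amblyopic eye, nudging the
    brain to stop suppressing it. gain=1.8 is an assumed value."""
    def boost(img):
        return np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)

    if weak_eye == "left":
        return boost(left_img), right_img
    return left_img, boost(right_img)
```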
"We're sort of outside the context of VR, particularly for the patients who use it, and definitely for the doctors who are not playing any of the VR games, typically," Blaha said of his company Vivid Vision, which is now in about 50 clinics in the US and Canada, plus a few in Europe and Australia.
Vivid Vision is also working with the University of California, Berkeley on improving depth perception in adults.
On the more serious end of the vision spectrum, there's legal blindness.
VA-ST, a startup out of the University of Oxford, makes technology for people with vision problems. Its SmartSpecs headset is built as a portable device for those who are legally blind or partially sighted.
SmartSpecs, which don't have a price just yet, boost the visibility of objects and faces by accentuating them and their edges against a darkened background. They even work in darkness, helping users find doorways or navigate around furniture. SmartSpecs' algorithms look for items like faces, including facial expressions, and text in real time.
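One plausible way to produce that "bright edges on a darkened background" effect is classic edge detection over a dimmed scene. The sketch below, in Python with OpenCV, is an illustrative stand-in using Canny edges, not VA-ST's actual algorithm, and the thresholds and dimming factor are assumptions.

```python
import cv2
import numpy as np

def edge_enhanced_view(frame, dim=0.2, low=50, high=150):
    """Darken the scene and overlay bright, thickened edges.
    dim, low and high are assumed values for illustration."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))  # thicken lines

    out = (frame.astype(np.float32) * dim).astype(np.uint8)
    out[edges > 0] = (255, 255, 255)  # draw edges in bright white
    return out
```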
"I think the real trick is coming up with ways of delivering relevant information to the user without bombarding them with irrelevant info," said co-founder Stephen Hicks, who is also a university research lecturer and Royal Academy of Engineering enterprise fellow at Oxford.
SmartSpecs are meant to give users more confidence and independence. They're set to launch in mid-2017.
"We have the potential to make a dent in this feeling of isolation and helplessness that many visually impaired individuals experience," he said.