Finding solutions to help blind people in their daily lives is an objective pursued by multiple research teams worldwide.
Some of these groups explore methods to restore, at least in part, lost vision, while others look for ways to compensate for its absence by converting environmental information into messages directed to other senses, guiding those affected through the world around them. This week, the journal Nature Machine Intelligence publishes details of a device that, following this second line of research, can guide blind people around any obstacle in their path.
The device, which the wearer can put on like a simple pair of glasses, uses artificial intelligence algorithms to interpret information from the surroundings and guides the user through voice commands and skin stimuli so that they do not stumble over any object.
Specifically, the device, developed by researchers at Shanghai Jiao Tong University in Shanghai (China), converts the images captured by a built-in camera into auditory cues, delivered through headphones, and into vibrations from an artificial skin worn on the wrist that guide the user's direction and movements.
The scientists, led by Leilei Gu, first tested the usefulness of the device on humanoid robots with a walking capacity similar to that of humans, and later on people with either partial vision loss or complete blindness. The researchers conducted experiments in different environments, both virtual and real, designed to emulate everyday situations that blind people may encounter in their daily lives.
They tested the device's ability to guide people with visual impairments in three different scenarios. One of them required several turns and changes of direction. They first conducted the experiments with a group of 12 patients and then validated the strategy with another eight patients, all with severe visual impairments. In this case, scenarios were set in both open and closed environments, with the presence of static and moving obstacles, and in settings that simulated offices and other work environments.
The experiments demonstrated the usefulness of the device. Before using the developed 'glasses', the vast majority of the individuals studied stumbled over objects in the room or could not complete the set route independently. In contrast, with the device, all of them were able to complete the assigned tasks. Furthermore, as the experiments progressed, their ability to walk following the device's instructions improved, allowing them to complete the predetermined tasks more quickly and efficiently. The individuals who participated in the study rated the usability of the device positively.
The findings, as noted by the researchers in the scientific journal, suggest that the integration of tactile and audio technology components provides effective visual support.
Overall, they conclude, "the system emerges as a promising research prototype, paving the way for future visual assistance advances that individuals can use."
"Conceived as an open platform, the system invites interdisciplinary collaboration to achieve progress, including improvements in vision models, integrated electronics, or vision neuroscience."
However, as they acknowledge, "the involvement of a larger and more diverse group of visually impaired individuals is crucial to better address their specific needs."
For Manuel Lozano, from the Biomedical Signal Processing and Interpretation (BIOSPIN) group at the Institute for Bioengineering of Catalonia (IBEC) and associate professor in the Department of Systems Engineering, Automatics, and Industrial Informatics at the Universitat Politècnica de Catalunya (UPC), the system proposed in this study presents some features that, in his opinion, represent "an interesting development for the advancement of vision support systems."
"From my point of view, the most interesting contribution of this study lies in the hardware. Unlike other vision support systems based on smart glasses, such as Envision Glasses, Biel Smartgaze, or NuEyes, the system proposed in this study combines smart glasses with haptic sensors and smart insoles to enhance the user's interaction with the environment," Lozano points out.
"The A-skin device incorporated into the system complements the artificial vision component by providing a mechanism for detecting objects at close range and alerting the user, which is particularly useful for lateral obstacles. As a result, the proposed system covers a wider monitoring angle than other commercial systems, supervising not only a broad central region but also a peripheral area," he explains.
"The use of such wearable technologies, implemented with ultra-thin, lightweight, and flexible materials, is on the rise and is a current research topic," continues the specialist, whose group explores the use of tattoo-type electrodes for recording physiological signals, specifically electromyographic signals for studying bruxism or respiratory muscle activity.
Regarding the limitations of the work, Lozano highlights "the need to test the system on a larger sample of users. One of the critical points of these technologies is their acceptability and usability for end users. In this sense, it is important to have verified data on the level of acceptance and use that the proposed system would achieve in a large sample of users with vision problems."
"While there are more advanced technologies for vision support, such as artificial eyes or biomimetic artificial retinas, I believe the proposed system is a promising development for supporting people with vision problems. It offers a balanced, user-centred solution, comfortable, simple, and portable, while incorporating hardware components that set it apart from other commercial systems on the market and improve the user's interaction with their environment," he concludes.