The visually impaired might soon be able to “see” the objects around them using their hands, thanks to a robotic wearable being developed by a team of researchers from the University of Arkansas and the University of Nevada. The robotic “glove” in question will give users a better sense of what they are reaching for through haptic and audio feedback. According to Yantao Shen, assistant professor and lead researcher on the project, the team can pre-map a potential user’s hand and create a lightweight, form-fitting device packed with cameras, electrical sensors and mechanical sensors.
The combination of tactile sensors, temperature sensors, miniaturized microphones and high-resolution cameras will analyze the object the user is reaching for and deliver information about its precise location, shape and size. The tech is still in its early development phases and there’s no word on when it will be released, but it’s nice to know that someone is working on making things easier for the blind and visually impaired. Describing the upcoming robotic wearable, Shen stated the following:
“Not only will this device help blind and visually impaired people, the methods and technology we develop will have great potential in advancing small and wearable robot autonomy.”