Hands-On Vision: How a Wrist Camera Can Expand the World for All Users
Inside the Human-Computer Integration Lab at the University of Chicago, a team of researchers is rethinking how technology helps people interact with the world. Their latest research explores what happens when sensory substitution devices shift from the eyes’ perspective to the hands’ point of view. The work is a collaboration among three equal-contribution first authors: Shan-Yuan Teng (a PhD student soon joining National Taiwan University as faculty), Gene S-H Kim (a summer intern from Stanford, soon to begin his PhD at MIT), and Xuanyou (Zed) Liu (a summer intern from the University of Pennsylvania, now starting his PhD at Northwestern University), under the guidance of Associate Professor Pedro Lopes.
Sensory substitution translates information from one sensory modality into another. Translating visual information into tactile feedback, for example, can offer a way of “seeing” for blind or low-vision (BLV) users, and such systems have long relied on cameras mounted near the eyes to mimic vision. Lopes and his team wanted to explore a more dynamic approach. They built a device that lets users “see with their hands,” combining a wrist-mounted camera with an electrotactile display on the back of the hand that delivers tactile signals for identifying and interacting with objects. Essentially, the hand-camera captures an image of an object, like a bottle or a pen, and translates it into touchable patterns on the wearer’s hand, allowing them to explore the world by hovering their hand over physical spaces.
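To make that pipeline concrete, here is a minimal sketch of how a camera frame might be condensed into a coarse grid of stimulation intensities. The grid size, the brightness-based encoding, and the function names are illustrative assumptions for demonstration, not the team’s actual implementation.

```python
# Illustrative sketch (not the authors' system): condensing a camera frame
# into per-electrode stimulation intensities for a small tactile grid.
import numpy as np

GRID_ROWS, GRID_COLS = 5, 5  # hypothetical electrode layout on the back of the hand

def frame_to_tactile(frame: np.ndarray) -> np.ndarray:
    """Map a grayscale frame (H x W, values 0-255) to intensities in [0, 1]."""
    h, w = frame.shape
    cell_h, cell_w = h // GRID_ROWS, w // GRID_COLS
    intensities = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            cell = frame[r * cell_h:(r + 1) * cell_h, c * cell_w:(c + 1) * cell_w]
            # Assumed encoding: mean brightness of each cell, so a bright
            # object against a dark background "presses" harder on the skin.
            intensities[r, c] = cell.mean() / 255.0
    return intensities

# Demo with a synthetic frame: a bright "bottle" blob on a dark table.
frame = np.zeros((100, 100), dtype=np.uint8)
frame[20:80, 40:60] = 220  # the object
print(np.round(frame_to_tactile(frame), 2))
```

In this toy version, hovering the hand so the object drifts across the camera’s view would shift the activated cells across the grid, which is the basic feedback loop the article describes.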
This innovation shifts sensory substitution beyond navigation to include hands-on precision tasks. Sighted and BLV participants tested the device, completing tasks such as picking up items, aligning a cap onto a bottle, and even locating a person for a handshake using only the tactile signals. The researchers found that this hands-focused approach offered unique advantages, particularly for ergonomics and detailed exploration.
“I could imagine the hand-camera as an extension of my arm—it felt like my hand was seeing,” shared one participant during the trials.
The system and its experiments were heavily shaped by Kim’s lived experience as a blind person, grounding the design in the realities of the BLV experience.
Participants first compared the effectiveness of the wrist camera to a traditional forehead-mounted camera setup, and later had the freedom to use both devices simultaneously. Many chose to combine the perspectives, using the eye-device for broader spatial awareness and switching to the hand-device for finer details.
“As a blind person, I’m able to do just about anything my able-bodied peers are able to do, but I rely on non-visual strategies,” said Kim. “If I place my keys on the table and forget exactly where they are, I’ll gently sweep my hands across the table until it collides with the key. But if I drop my keys on the floor, (…) I don’t want to sweep my hands across the floor in the same way. (…) So what if I could feel objects without touching them?”
The trials uncovered clear benefits of the hands’ perspective. Users found the wrist-mounted camera more natural for reaching and interacting with objects, as it allowed intuitive movements without awkward postures like leaning or craning their necks. This ergonomic advantage meant participants were less physically strained and could engage in tasks more fluidly.
“Designing this electrotactile system specifically for blind or low-vision users was particularly meaningful and challenging for me as the interactive system relies exclusively on tactile feedback,” expressed Liu.
Moreover, BLV participants valued the device for its potential in real-world applications, envisioning tasks like retrieving dropped objects, navigating stairs, or even exploring crowded public spaces alongside traditional mobility tools like canes.
“BLV participants could envision using the device in everyday life as it feels like a natural add-on to their existing non-visual strategies,” said Teng. “For instance, one participant shared this after the study: ‘Earlier today I dropped my cane, and I had to crawl on the floor. But if I had one of these devices I could follow the electrotactile sensations.’”
Interestingly, the study also revealed how flexible humans are in adapting to entirely new sensory experiences. Despite never having used sensory substitution devices, participants instinctively developed strategies like “scanning” with the device by rotating their wrists or leveraging tactile feedback to memorize spatial layouts. Some described this as “drawing a mental map” of the environment while they worked.
Researchers believe that this ability to adapt opens exciting possibilities for future devices. Could people “see” with their fingers or their feet? What would happen if multiple limbs provided tactile perspectives simultaneously? These ideas, though highly experimental, extend the horizon of what sensory substitution can achieve, blending human adaptability with cutting-edge technology.
“Sensory substitution has been explored in neuroscience since the 1960s. This work is an example of bringing attention to the user experience, and of expanding how users might actively employ these devices in their lives, such as for precise manual tasks like assembly and soldering,” Teng added.
Despite the promising results, the research team acknowledges limitations in their study. The tasks tested were relatively narrow in scope, and further iterations of the device will require refining camera placement, tactile resolution, and machine-learning algorithms for greater precision. Privacy is another avenue for exploration, since camera-based tools inherently capture the surrounding environment.
Still, the Human-Computer Integration Lab aims to explore broader applications in accessibility and beyond. For people working in tight spaces, such as surgeons or mechanics, or for those navigating unfamiliar terrain, tactile technology could provide valuable enhancements. Sighted participants in the study imagined using the hand-device for tasks like climbing in the dark or searching for objects underwater, reinforcing its versatility.
By allowing users to “see” in entirely new ways, this work from the Human-Computer Integration Lab introduces exciting possibilities for both accessibility and human-computer interaction. Whether you’re blind, low-vision, or sighted, the team’s ambitious exploration of the hands’ perspective offers a glimpse into the future of touch and technology.