
Harvard students develop wearable navigation robot for people with vision disabilities

This mock-up shows how a user would wear the Foresight navigational aid. Photo: Foresight

Harvard University students have developed a wearable robotic device to help people who are blind or have low vision navigate more easily.

The students have launched a startup called Foresight, which makes a wearable navigation aid of the same name for people who are blind, built on cutting-edge soft robotics and computer vision technology. Foresight connects to the user’s smartphone camera, worn around the neck. When the system detects nearby objects, soft textile actuators on the body inflate to provide haptic feedback as those objects approach and pass.

“Being diagnosed with a permanent eyesight problem means getting used to a whole new way of navigating the world,” said Ed Bayes, a student in Harvard’s MDE program.

“We spoke to people living with visual impairment to understand their needs and built around that.”

The startup was born out of Nano Micro Macro, a course offered jointly by the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Graduate School of Design (GSD), in which teams of students are challenged to apply emerging technology from Harvard labs. Foresight was inspired by students’ experience with blind family members, who spoke of the stigma surrounding the use of assistive devices.

“Importantly, Foresight is discreet, affordable, and intuitive. It provides an extra layer of comfort to help people move around more confidently,” Bayes added.

“Most wearable navigation aids rely on vibrating motors, which can be uncomfortable and bothersome to users,” said Anirban Ghosh, M.D.E. ’21. “Soft actuators are more comfortable and can provide the same tactile information.”

With Foresight, the distance between an object and the user determines how much pressure the user feels from the actuators on their body: the closer the object, the stronger the feedback.

“The varied inflation of multiple actuators represents the angular differences of where those objects are in space,” said Nick Collins.

“We want to know if the information we are giving them actually translates into a user-friendly interpretation of objects in the space around them,” Collins said. “This is another tool in their arsenal. Our desire isn’t necessarily to get rid of any current tools, but to provide another, more robust sensory experience.”
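To make that behavior concrete, the sketch below (in Python) illustrates the kind of mapping the team describes: an object’s distance sets how strongly an actuator inflates, and its bearing selects which actuator responds. The actuator layout, range threshold, and names here are assumptions made for illustration only, not Foresight’s actual implementation.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    distance_m: float   # distance from the wearer, in meters
    bearing_deg: float  # angle relative to straight ahead: negative = left, positive = right

# Assumed layout: a small array of soft actuators spread across the torso.
ACTUATOR_BEARINGS_DEG = [-60.0, -20.0, 20.0, 60.0]
MAX_RANGE_M = 3.0  # assumed threshold beyond which objects are ignored

def inflation_levels(objects: list[DetectedObject]) -> list[float]:
    """Return one inflation level (0.0 to 1.0) per actuator.

    Closer objects produce higher pressure; each object drives the
    actuator whose bearing is nearest to the object's bearing.
    """
    levels = [0.0] * len(ACTUATOR_BEARINGS_DEG)
    for obj in objects:
        if obj.distance_m >= MAX_RANGE_M:
            continue
        # Pressure grows as the object approaches the wearer.
        pressure = 1.0 - (obj.distance_m / MAX_RANGE_M)
        # Pick the actuator closest in angle to the object.
        idx = min(range(len(ACTUATOR_BEARINGS_DEG)),
                  key=lambda i: abs(ACTUATOR_BEARINGS_DEG[i] - obj.bearing_deg))
        levels[idx] = max(levels[idx], pressure)
    return levels

if __name__ == "__main__":
    # An obstacle 1 m away and slightly to the left strongly inflates the
    # left-of-center actuator; a farther object on the right barely registers.
    print(inflation_levels([DetectedObject(1.0, -25.0),
                            DetectedObject(2.8, 50.0)]))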

The team is inspired by the opportunity to use emerging technology to deliver a practical solution that could help many people.
