Smart glove translates American sign language into speech

ASL sensor glove with smartphone

UCLA scientists have created a glove that translates American Sign Language into speech in real time through a smartphone app.

“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them,” said Jun Chen, an assistant professor of bioengineering at the UCLA Samueli School of Engineering and the principal investigator on the research. “In addition, we hope it can help more people learn sign language themselves.”

The system includes a pair of gloves with thin, stretchable sensors that run the length of each of the five fingers. These sensors, made from electrically conducting yarns, pick up hand motions and finger placements that stand for individual letters, numbers, words and phrases.

The device then turns the finger movements into electrical signals, which are sent to a dollar-coin-sized circuit board worn on the wrist. The board transmits those signals wirelessly to a smartphone that translates them into spoken words at a rate of about one word per second.
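To make that flow concrete, here is a minimal, hypothetical sketch in Python of how per-finger sensor readings could be mapped to spoken words. The gesture table, the threshold, and the function names (read_finger_sensors, speak) are illustrative assumptions, not the UCLA team's actual hardware interface or recognition method.

```python
import time

# Illustrative lookup: a coarse open/closed pattern for the five fingers
# mapped to a sign label. A real system would use a trained classifier
# over continuous sensor streams rather than a fixed table.
GESTURE_TABLE = {
    (0, 1, 1, 1, 1): "A",      # assumed pattern: thumb open, fingers curled
    (1, 1, 0, 0, 0): "hello",  # assumed multi-finger pattern
}

def speak(word):
    # Stand-in for the smartphone's text-to-speech output.
    print(f"[speech] {word}")

def read_finger_sensors():
    # Placeholder for one sample from the five yarn-based stretch sensors;
    # on the real device these readings arrive wirelessly from the
    # coin-sized wrist board. Values here are normalized flex per finger.
    return (0.1, 0.9, 0.85, 0.92, 0.88)

def quantize(readings, threshold=0.5):
    # Collapse continuous flex values into a coarse open/closed pattern.
    return tuple(int(r > threshold) for r in readings)

def translate(num_words=3):
    # Classify roughly one gesture per second, mirroring the reported rate.
    for _ in range(num_words):
        pattern = quantize(read_finger_sensors())
        word = GESTURE_TABLE.get(pattern)
        if word:
            speak(word)
        time.sleep(1.0)

if __name__ == "__main__":
    translate()
```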

The researchers also added adhesive sensors to testers’ faces — in between their eyebrows and on one side of their mouths — to capture facial expressions that are a part of American Sign Language.

Previous wearable systems that offered translation from American Sign Language were limited by bulky and heavy device designs or were uncomfortable to wear, Chen said.

The device developed by the UCLA team is made from lightweight, inexpensive but long-lasting stretchable polymers, and its electronic sensors are likewise highly flexible and low-cost.
