
Team programs a humanoid robot to communicate in sign language

For a robot to "learn" sign language, several areas of engineering must be combined: artificial intelligence, neural networks and computer vision, as well as underactuated robotic hands.

“One of the main new developments of this research is that we united two major areas of robotics: complex systems (such as robotic hands) and social interaction and communication,” explains Juan Víctores, one of the researchers from the Robotics Lab in the Department of Systems Engineering and Automation at UC3M.

As a first step, the scientists specified, in simulation, the position of each phalanx needed to depict particular signs from Spanish Sign Language. They then reproduced these positions with the robotic hand, trying to make the movements similar to those a human hand could make. “The objective is for them to be similar and, above all, natural. Various types of neural networks were tested to model this adaptation, and this allowed us to choose the one that could perform the gestures in a way that is comprehensible to people who communicate with sign language,” the researchers explain.
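To make the idea concrete, here is a minimal, hypothetical sketch of that kind of mapping, not the UC3M team's actual code: an underactuated hand has fewer motors than finger joints, so a small neural network can learn to translate the per-phalanx joint angles specified in simulation into motor commands. All dimensions, data and names below are illustrative assumptions.

```python
# Hypothetical sketch: learn a mapping from desired phalanx angles
# (taken from a simulated sign) to motor commands for an underactuated
# hand. Dimensions and data are stand-ins, not real robot values.
import numpy as np

rng = np.random.default_rng(0)

N_PHALANXES = 15   # assumed: 3 phalanx joints x 5 fingers
N_MOTORS = 6       # assumed: 6 tendon motors, fewer motors than joints

# Stand-in training data: in a real setup, pairs of (simulated joint
# angles, motor commands) would come from the simulator / calibration.
X = rng.uniform(0.0, 1.5, size=(500, N_PHALANXES))   # target angles (rad)
true_map = rng.normal(size=(N_PHALANXES, N_MOTORS))
Y = np.tanh(X @ true_map)                            # surrogate commands

# One-hidden-layer neural network trained with plain gradient descent.
H = 32
W1 = rng.normal(scale=0.1, size=(N_PHALANXES, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, N_MOTORS));    b2 = np.zeros(N_MOTORS)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - Y                        # mean-squared-error gradient
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)        # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def motor_commands(target_angles):
    """Predict motor commands approximating the desired phalanx pose."""
    h = np.tanh(target_angles @ W1 + b1)
    return h @ W2 + b2

# Usage: the angles for one sign, e.g. exported from the simulator.
sign_pose = rng.uniform(0.0, 1.5, size=N_PHALANXES)
print(motor_commands(sign_pose))
```

The appeal of a learned mapping here is that the underactuation means no exact joint-by-joint solution exists; the network finds the motor commands whose resulting pose looks closest to, and as natural as, the intended sign.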
