
Team programs a humanoid robot to communicate in sign language

For a robot to “learn” sign language, several areas of engineering must be combined: artificial intelligence, neural networks and computer vision, as well as underactuated robotic hands.

“One of the main new developments of this research is that we united two major areas of Robotics: complex systems (such as robotic hands) and social interaction and communication,” explains Juan Víctores, one of the researchers from the Robotics Lab in the Department of Systems Engineering and Automation of the UC3M.

As a first step, the scientists specified in simulation the position of each phalanx needed to depict particular signs from Spanish Sign Language. They then reproduced these positions with the robotic hand, trying to make its movements as close as possible to those of a human hand. “The objective is for them to be similar and, above all, natural. Various types of neural networks were tested to model this adaptation, and this allowed us to choose the one that could perform the gestures in a way that is comprehensible to people who communicate with sign language,” the researchers explain.
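The mapping step described above, from simulated phalanx positions to commands for an underactuated hand, can be sketched as a small regression network. Everything here is an illustrative assumption, not the UC3M team's actual architecture or data: the joint counts, the synthetic “coupling” used to generate training pairs, and the one-hidden-layer network are placeholders for the idea of learning a sim-to-hand mapping.

```python
import numpy as np

# Hedged sketch: learn to map simulated human phalanx angles to actuator
# commands for an underactuated robotic hand. Dimensions and data are
# illustrative assumptions, not the real UC3M setup.
rng = np.random.default_rng(0)

HUMAN_JOINTS = 15  # e.g. 3 phalanx angles per finger x 5 fingers (assumption)
ROBOT_DOFS = 6     # underactuated hand: fewer actuators than human joints

# Synthetic training pairs: pretend each actuator command is a fixed
# nonlinear coupling of the human joint angles (stand-in for real data).
coupling = rng.normal(size=(HUMAN_JOINTS, ROBOT_DOFS))
X = rng.uniform(0.0, np.pi / 2, size=(2000, HUMAN_JOINTS))  # angles (rad)
Y = np.tanh(X @ coupling)                                   # actuator targets

# One-hidden-layer MLP trained with plain batch gradient descent.
H, lr = 32, 0.05
W1 = rng.normal(scale=0.1, size=(HUMAN_JOINTS, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, ROBOT_DOFS));   b2 = np.zeros(ROBOT_DOFS)

initial_loss = None
for epoch in range(300):
    h = np.tanh(X @ W1 + b1)       # hidden activations
    pred = h @ W2 + b2             # predicted actuator commands
    err = pred - Y
    loss = float(np.mean(err ** 2))
    if initial_loss is None:
        initial_loss = loss
    # Backpropagation of the mean-squared error.
    g_pred = 2.0 * err / len(X)
    g_W2, g_b2 = h.T @ g_pred, g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    g_W1, g_b1 = X.T @ g_h, g_h.sum(axis=0)
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

def hand_command(phalanx_angles):
    """Map simulated phalanx angles (radians) to robot actuator commands."""
    hidden = np.tanh(phalanx_angles @ W1 + b1)
    return hidden @ W2 + b2
```

In this sketch the “choose the best network” step the researchers describe would amount to repeating the training with different architectures and keeping the one whose reproduced gestures score best with sign-language users; only the single regression fit is shown here.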
