Research shows pre-verbal infants understand sign language

Rochester Institute of Technology’s National Technical Institute for the Deaf used eye tracking to study how infants learn language.

NTID researcher and Assistant Professor Rain Bosworth and alumnus Adam Stone studied early-language knowledge in young infants and children by recording their gaze patterns as they watched a signer. The goal was to learn, just from gaze patterns alone, whether the child was from a family that used spoken language or signed language at home.

They tested two groups of hearing infants and children that differed in their home language. One “control” group had hearing parents who spoke English and never used sign language or baby signs. The other group had deaf parents who used only American Sign Language at home. Both sets of children had normal hearing. The control group saw sign language for the first time in the lab, while the native-signing group was already familiar with it.

The study, published in Developmental Science, showed that the non-signing infants and children looked at the area in front of the signer’s torso known as “signing space,” where the hands fall about 80 percent of the time during signing. The signing infants and children, however, looked primarily at the face, barely looking at the hands.
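For readers curious what this kind of “area of interest” analysis can look like in practice, here is a minimal, hypothetical Python sketch that bins gaze samples into “face” and “signing space” regions and reports the share of looking time in each. The coordinates, data format, and function names are invented for illustration; they are not taken from the study’s actual analysis pipeline.

# Hypothetical sketch: tallying eye-tracking samples into two areas of
# interest (AOIs), "face" and "signing_space", and reporting the share of
# looking time in each. The AOI rectangles and sample format are invented
# for illustration; they are not from the published study.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # horizontal screen coordinate in pixels
    y: float  # vertical screen coordinate in pixels

# Assumed AOI bounding boxes (left, top, right, bottom) in screen pixels.
AOIS = {
    "face": (400, 50, 600, 250),
    "signing_space": (300, 250, 700, 650),  # area in front of the torso
}

def in_box(s: GazeSample, box: tuple[float, float, float, float]) -> bool:
    left, top, right, bottom = box
    return left <= s.x <= right and top <= s.y <= bottom

def aoi_proportions(samples: list[GazeSample]) -> dict[str, float]:
    """Return the fraction of gaze samples falling in each AOI."""
    counts = {name: 0 for name in AOIS}
    for s in samples:
        for name, box in AOIS.items():
            if in_box(s, box):
                counts[name] += 1
                break  # assign each sample to at most one AOI
    total = len(samples) or 1
    return {name: n / total for name, n in counts.items()}

# Example: a child who mostly fixates the face area.
samples = [GazeSample(500, 150)] * 90 + [GazeSample(500, 400)] * 10
print(aoi_proportions(samples))  # {'face': 0.9, 'signing_space': 0.1}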

According to the findings, the expert sign-watching behavior is already present by about 5 months of age.

“This is the earliest evidence, that we know of, for effects of sign-language exposure,” said Bosworth. “At first, it does seem counter-intuitive that the non-signers are looking at the hands and signers are not. We think signers keep their gaze on the face because they are relying on highly developed and efficient peripheral vision. Infants who are not familiar with sign language look at the hands in signing space perhaps because that is what is perceptually salient to them.”

Another possible reason why signing babies keep their gaze on the face could be because they already understand that the face is very important for social interactions, added Bosworth.

“We think the reason perceptual gaze control matures so rapidly is because it supports later language learning, which is more gradual,” Bosworth said. “In other words, you have to be able to know where to look before you learn the language signal.”

Bosworth said more research is needed to understand the gaze behaviors of deaf babies who are or are not exposed to sign language.
