
New Voice Assistant Makes Web Browsing Easier for Blind People

Researchers have developed a new voice assistant that allows people with vision disabilities to access web content from smart speakers and similar devices as quickly and effortlessly as possible.

The tool is called Voice Exploration, Retrieval, and Search (VERSE). The study was led by Alexandra Vtyurina, a University of Waterloo Faculty of Mathematics student, and Leah Findlater, an assistant professor from the University of Washington.

“People with vision disabilities often rely on screen readers and, increasingly, on voice-based virtual assistants when interacting with computer systems,” said Vtyurina, a PhD candidate in Waterloo’s David R. Cheriton School of Computer Science, who undertook the study during her internship at Microsoft Research. “Virtual assistants are convenient and accessible but lack the ability to deeply engage with content, such as reading beyond the first few sentences of an article or listing alternative search results and suggestions. In contrast, screen readers allow for deep engagement with accessible content and provide fine-grained navigation and control, but at the cost of reduced walk-up-and-use convenience.”

The primary input method for VERSE is voice, so users can say “next”, “previous”, “go back” or “go forward”. VERSE can also be paired with an app running on a smartphone or smartwatch. These devices serve as input accelerators, similar to keyboard shortcuts. For example, rotating the crown on a smartwatch advances VERSE to the next search result, section, or paragraph, depending on the navigation mode.
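The article does not describe VERSE’s internals, but the interaction model it outlines can be pictured as a small command dispatcher that maps each spoken command (or an equivalent gesture, such as a crown rotation) to a navigation action within the current mode. The sketch below is purely illustrative and is not VERSE’s implementation; the NavigationState class, its fields, and the sample items are all assumptions made for the example.

```python
# Hypothetical sketch (not the VERSE implementation): dispatching the voice
# commands mentioned in the article ("next", "previous", "go back") against
# a list of items in the current navigation mode (e.g. search results,
# sections, or paragraphs).

from dataclasses import dataclass, field
from typing import List


@dataclass
class NavigationState:
    mode: str = "results"  # could also be "sections" or "paragraphs"
    items: List[str] = field(
        default_factory=lambda: ["Result 1", "Result 2", "Result 3"]
    )
    index: int = 0
    history: List[int] = field(default_factory=list)

    def handle(self, command: str) -> str:
        """Map a spoken command to a navigation action; return what to read aloud."""
        if command == "next" and self.index < len(self.items) - 1:
            self.history.append(self.index)
            self.index += 1
        elif command == "previous" and self.index > 0:
            self.history.append(self.index)
            self.index -= 1
        elif command == "go back" and self.history:
            self.index = self.history.pop()
        # "go forward" would need a separate forward stack; omitted for brevity.
        return self.items[self.index]


# A crown rotation on a paired smartwatch could trigger the same action as "next":
nav = NavigationState()
print(nav.handle("next"))     # -> "Result 2"
print(nav.handle("go back"))  # -> "Result 1"
```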
