
Student designs robotic guide dog for people with vision disabilities

Theia - a portable and concealable handheld device that guides users through outdoor environments with little user input.

A student from Loughborough University has designed an autonomous way-finding device for people with vision disabilities who are unable to keep or obtain a service animal.

By recreating the role of a guide dog and plotting quick, safe routes to destinations using real-time data, the invention could be the future for people with vision disabilities.

Anthony Camu, a final-year Industrial Design and Technology student, wanted to design a product that replicates a guide dog’s functions for visually impaired people who cannot keep a service animal.

Inspired by virtual reality gaming consoles, he has conceptualised and started to prototype ‘Theia’, a portable and concealable handheld device that guides users through outdoor environments and large indoor spaces with very little user input.

In essence, it’s a handheld robotic guide dog – minus the waggy tail.

Drawing inspiration from autonomous vehicles, Theia aims to translate that sense of effortless driving into a system of effortless walking, helping users make complex manoeuvres without needing to see or think.

Anthony has successfully created prototypes that feature control moment gyroscope (CMG) technology.

The prototypes were used to experiment with how gyroscopic momentum can steer the movement of the user’s hand.
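The physics behind this is the gyroscopic effect: tilting a rapidly spinning flywheel produces a torque perpendicular to both the spin and tilt axes, which a handheld device could use to nudge the user’s hand. The sketch below is purely illustrative, not Theia’s actual design; every parameter name and value is an assumption chosen for the example.

```python
import math

# Illustrative sketch (not Theia's firmware): the output torque of a
# control moment gyroscope, tau = I * omega_spin * omega_gimbal.
# All names and numbers here are assumptions for illustration only.

def cmg_torque(inertia_kg_m2: float, spin_rate_rad_s: float,
               gimbal_rate_rad_s: float) -> float:
    """Torque produced by gimballing a spinning flywheel.

    The torque acts perpendicular to both the spin axis and the
    gimbal axis -- the 'push' a handheld CMG could apply to a hand.
    """
    return inertia_kg_m2 * spin_rate_rad_s * gimbal_rate_rad_s

# Hypothetical small flywheel: 40 g solid disc, 2 cm radius, 20,000 rpm
mass, radius = 0.040, 0.020
inertia = 0.5 * mass * radius ** 2        # solid disc: I = (1/2) m r^2
spin = 20000 * 2 * math.pi / 60           # rpm -> rad/s
tau = cmg_torque(inertia, spin, gimbal_rate_rad_s=2.0)
print(f"{tau:.3f} N*m")                   # prints 0.034 N*m
```

Even this tiny hypothetical flywheel yields a few hundredths of a newton-metre, enough of a twist at the wrist to be clearly perceptible, which is why excessive vibration and motor stress are real engineering concerns at these spin rates.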

Although the project is in its infancy and has issues such as excessive vibration and motor failures, the potential is there.

Anthony is hoping to build on his design and produce more prototypes by working with design engineers and programmers – perhaps even founding a start-up company and launching a crowdfunding campaign.
