Ian Reynolds
MIT EECS — Foxconn Undergraduate Research and Innovation Scholar
Echolocation: Sound and sonar as a mobility aid
2015–2016
Aude Oliva
In nature, echolocation is employed by some animals to augment or replace visual input. Some blind humans have also learned to echolocate by producing tongue clicks; while this technique has measurably improved these individuals' mobility, its neurological underpinnings are still being studied. In 2015, Sohl-Dickstein et al. described a prototype "ultrasonic helmet" that acted as an echolocation aid by emitting ultrasonic pulses, listening for their echoes, and transforming the result into a sound audible to the human wearer. An improved version of this device in a wearable form factor has potential as a platform for further experimentation in this domain and for real-world tests of this new assistive-technology paradigm, unlocking echolocation as a mobility aid for all who find it useful.
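The core signal chain of such a device can be illustrated with a small sketch. The abstract does not specify how the helmet maps ultrasound into the audible band, so the heterodyning approach below, along with all frequencies and parameters, is a hypothetical example: a simulated 40 kHz echo is multiplied by a 38 kHz local oscillator so the difference frequency, 2 kHz, lands in the range of human hearing.

```python
import numpy as np

# All parameters are illustrative assumptions, not the actual device's values.
FS = 192_000       # sample rate high enough to capture ultrasound
F_ULTRA = 40_000   # frequency of the emitted ultrasonic pulse
F_LO = 38_000      # local-oscillator frequency for downshifting

def synth_echo(delay_s, duration_s=0.005):
    """Simulate a received echo: a short 40 kHz tone burst that
    arrives after the pulse's round-trip travel time."""
    n_delay = int(delay_s * FS)
    n_burst = int(duration_s * FS)
    t = np.arange(n_burst) / FS
    burst = np.sin(2 * np.pi * F_ULTRA * t) * np.hanning(n_burst)
    return np.concatenate([np.zeros(n_delay), burst])

def downshift(signal):
    """Heterodyne the echo: mixing with a 38 kHz oscillator produces
    components at |40k - 38k| = 2 kHz (audible) and 78 kHz (discarded)."""
    t = np.arange(len(signal)) / FS
    mixed = signal * np.sin(2 * np.pi * F_LO * t)
    # crude low-pass (moving average) to suppress the 78 kHz image
    kernel = np.ones(32) / 32
    return np.convolve(mixed, kernel, mode="same")

echo = synth_echo(delay_s=0.006)   # ~1 m of round-trip distance at ~343 m/s
audible = downshift(echo)

# The dominant frequency of the output should sit near 2 kHz.
spectrum = np.abs(np.fft.rfft(audible))
peak_hz = np.fft.rfftfreq(len(audible), 1 / FS)[np.argmax(spectrum)]
```

Because the echo's delay and amplitude survive the mixing step, the wearer hears a 2 kHz tone whose timing and loudness encode the distance and size of nearby obstacles.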
I took a SuperUROP because I wanted an intensive research project requiring sustained commitment. This project was a great match for me because it builds on signal processing techniques I learned in 6.003 and gives me a chance to work on hardware design and embedded systems. Most of all, I was excited by the prospect of helping to develop a device that could empower blind people to navigate the world with more confidence.