Human navigation without vision: seeing with our ears and tongues through advances in sensory substitution and augmented reality
Speaker: Dr Michael J. Proulx
Video of the talk here
Our senses transduce information about the external world into electrochemical signals that the brain uses to determine what is where around us. Humans most often rely on vision for rapidly assessing the environment. When vision is obstructed, or if a person has a visual impairment, the other senses must compensate, and this is where technology can help by creating auditory or tactile displays that represent images. Here I will explore advances in sensory substitution and augmented reality, and how "seeing" with auditory and tactile displays can enable object recognition, localisation, and navigation.
Biography
Dr Michael J. Proulx is Reader in Psychology and Director of the Crossmodal Cognition Lab at the University of Bath. He is also Co-Director of the REVEAL Research Centre (REal & Virtual Environments Augmentation Labs), Co-Investigator for CAMERA 2.0 (Centre for the Analysis of Motion, Entertainment Research and Applications), and a doctoral supervisor in the Centre for Digital Entertainment and in ART-AI (Accessible, Responsible, and Transparent Artificial Intelligence) in the Department of Computer Science. He is a Fellow of the Society for Experimental Psychology and Cognitive Science of the American Psychological Association and a Fellow of the Royal Institute of Navigation. He is currently on a two-year sabbatical at Facebook Reality Labs Research in Redmond, Washington. Twitter: @MichaelProulx
https://researchportal.bath.ac.uk/en/persons/michael-proulx