In complex listening situations, speech understanding can be highly challenging due to multiple sound sources at different locations, background noise, and reverberation. Changes in head and body position, or in the focus of eye gaze, can provide additional cues for understanding or directing a conversation. For example, recognizable head and eye gestures can convey a listener's understanding before a talker has finished a sentence, or allow a talker to emphasize a point. Such head and body movements also play a major role in shaping the peripheral acoustic cues available to the listener, such as the binaural and spectral changes critical to sound source localization and segregation. Despite our detailed understanding of many of these peripheral cues in isolation, traditional laboratory settings often overlook the influence of head and body movements on auditory perception and speech communication in natural environments.
This research topic is intended to broadly capture the impact that head, body, and eye movements have on everyday communication, sound perception, and navigation. This collection of research is fundamental to understanding speech communication and auditory perception as they exist outside the laboratory, and knowledge in this area should drive innovation and development in hearing aids, virtual reality, and other clinical and consumer technology sectors that integrate information across multiple sensory domains to provide ecologically centered user experiences.