In the field of sound reproduction and communication, future audio technology is shifting emphasis toward sensations of immersion and presence. Such notions are intimately linked to the spatial dimensions of a multimedia scene, particularly its sound component, and are intensified in situations involving the active participation of the listener. This participation may take the form of navigation within a scene or gestural interaction with the objects it contains. Under these conditions, made possible by binaural and holophonic technology, the congruence and real-time updating of auditory spatial cues in accordance with the listener's movements or actions have a major impact on the sensation of presence. This context led to the development of a set of experiments on auditory spatial cognition, notably through the study of multisensory integration processes, with a focus on the auditory and idiothetic modalities (cues induced by the subject's own movements, including vestibular and proprioceptive information).
The experimental methods rely on behavioral experiments that observe a subject's performance, in terms of localization and navigation, under different exploratory contexts. We are also interested in the relationships between multisensory integration and emotion. For example, we may study the effects of spatial conflicts between sound and vision on subjects' emotional reactions, or assess the perception of numerosity (e.g. estimating the size of a crowd) through a given sensory modality and its connection to emotion.
IRCAM team: Acoustic and Cognitive Spaces.