Vincent MARTIN, from the EAC team at the STMS Lab (UMR 9912 - Ircam, Sorbonne Université, CNRS, Ministère de la Culture), will defend his thesis in English on Thursday, March 3rd at 2 PM at Ircam.
The defence can be watched via the YouTube link: https://youtu.be/JVz-PUjez8g or on the MEDIAS server: https://medias.ircam.fr/x36404d_soutenance-de-these-de-vincent-martin
The title is:
« Auditory distance perception of static sources in the context of audio-only augmented reality: an investigation of acoustic and non-acoustic cues »
Jury:
Norbert Kopčo (P.J. Šafárik University), Rapporteur
Nicolas Grimault (CNRS), Rapporteur
Quentin Grimal (Sorbonne Université/INSERM), Examiner
Etienne Hendrickx (Université de Brest), Examiner
Isabelle Viaud-Delmon (STMS - UMR9912 - Ircam, Sorbonne Université, CNRS, Ministère de la Culture), Supervisor
Olivier Warusfel (STMS - UMR9912 - Ircam, Sorbonne Université, CNRS, Ministère de la Culture), Co-supervisor
Abstract:
Audio-only Augmented Reality (AAR) refers to a set of technologies that aim to merge computer-generated auditory content into a user's acoustic environment. AAR systems face a fundamental requirement: the audio playback system must allow virtual sound events to be integrated seamlessly into the user's environment. This thesis investigates a variety of effects linking the auditory distance perception of virtual sound sources to the context of AAR applications. It focuses on how the specific perceptual context and primary objectives of AAR impose constraints on the design of the distance rendering approach used to generate virtual sound sources.
Several challenges arise from these requirements. The first part of the thesis concerns the critical role of acoustic cue reproduction in the auditory distance perception of virtual sound sources in the context of audio-only augmented reality. Auditory distance perception relies on a range of cues, categorized as acoustic and cognitive. We examined the strategies the auditory system uses to weight these cues when forming a percept of source distance. By considering different spatial and temporal segmentations, we attempted to characterize how early energy is perceived in relation to reverberation.
The second part of the thesis focuses on how environment-related cues can affect the perception of virtual sound sources in AAR applications. In such applications, the geometry of the environment is not always fully taken into account. In particular, the calibration effect that the perception of the visual environment exerts on auditory perception is generally overlooked. We also investigated the case in which co-occurring real sound sources, whose positions are unknown to the user, could affect the auditory distance perception of virtual sound sources through an intra-modal calibration effect.