The Sound Music Movement Interaction team (previously known as the Real-Time Musical Interactions team) carries out research and development on interactive systems for music and the performing arts. Our work covers all aspects of the interactive process, including the capture and multimodal analysis of the gestures and sounds produced by musicians, tools for synchronization and for managing interaction, and techniques for real-time sound synthesis and processing. These research projects and the software developed alongside them are generally carried out within interdisciplinary projects involving scientists, artists, teachers, and designers, and find applications in artistic creation, music education, movement learning, and industrial digital audio.
For more info, see http://ismm.ircam.fr
Major Themes
Modeling and Analysis of Sounds and Gestures
This theme covers theoretical developments concerning the analysis of sound and gesture streams or, more generally, of multimodal temporal morphologies. This research draws on diverse techniques for audio analysis and on the study of the gestures of performing musicians and dancers.
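As a concrete illustration of what such an analysis can look like, here is a minimal frame-based sketch in Python: it computes an RMS loudness envelope and a spectral centroid curve over a signal, two simple descriptors of a sound's temporal morphology. The descriptor choices, frame sizes, and the test sweep are illustrative assumptions, not the team's specific pipeline.

    # A minimal sketch of frame-based audio analysis, assuming a mono
    # signal sampled at 44.1 kHz; RMS loudness and spectral centroid are
    # illustrative descriptors, not the team's actual feature set.
    import numpy as np

    def analyze(signal, sr=44100, frame=1024, hop=512):
        """Return per-frame RMS and spectral centroid: one crude way to
        describe the temporal morphology of a sound."""
        freqs = np.fft.rfftfreq(frame, d=1.0 / sr)
        window = np.hanning(frame)
        rms, centroid = [], []
        for start in range(0, len(signal) - frame, hop):
            x = signal[start:start + frame] * window
            rms.append(np.sqrt(np.mean(x ** 2)))
            mag = np.abs(np.fft.rfft(x))
            centroid.append(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
        return np.array(rms), np.array(centroid)

    # Example: a sine sweep's centroid rises over time while RMS stays flat.
    t = np.linspace(0, 1, 44100, endpoint=False)
    sweep = np.sin(2 * np.pi * (200 + 1800 * t) * t)
    loudness, brightness = analyze(sweep)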
Technologies for Multimodal Interaction
This theme concerns our tools for the analysis and multimodal recognition of movements and sounds, as well as tools for synchronization (gesture following, for example) and visualization.
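To make the idea of gesture following concrete, here is a minimal sketch based on an online dynamic-time-warping alignment: an incoming stream of feature frames is matched against a prerecorded template, yielding an estimated position within the gesture. This is only a toy illustration of the synchronization principle; the class name, the feature representation, and the slope constraints are assumptions, and actual followers are typically probabilistic and considerably more robust.

    # A minimal sketch of gesture following, assuming gestures arrive as
    # streams of low-dimensional feature vectors (e.g., motion frames).
    import numpy as np

    class GestureFollower:
        def __init__(self, reference):
            self.ref = np.asarray(reference, dtype=float)   # (T, d) template
            self.cost = np.full(len(self.ref), np.inf)      # cumulative costs
            self.cost[0] = 0.0

        def step(self, frame):
            """Consume one input frame, return estimated position in [0, 1]."""
            d = np.linalg.norm(self.ref - frame, axis=1)    # local distances
            prev = self.cost
            new = np.empty_like(prev)
            new[0] = prev[0] + d[0]
            for i in range(1, len(d)):
                # allow staying, advancing by one, or skipping one ref frame
                new[i] = d[i] + min(prev[i], prev[i - 1],
                                    prev[i - 2] if i >= 2 else np.inf)
            self.cost = new
            return np.argmin(new) / (len(d) - 1)            # normalized position

    # Example: follow a 2-D circular gesture performed at a different speed.
    t_ref = np.linspace(0, 2 * np.pi, 100)
    template = np.stack([np.cos(t_ref), np.sin(t_ref)], axis=1)
    follower = GestureFollower(template)
    t_live = np.linspace(0, 2 * np.pi, 60)                  # faster performance
    for point in np.stack([np.cos(t_live), np.sin(t_live)], axis=1):
        position = follower.step(point)                     # advances toward 1.0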
Interactive Sound Synthesis and Processing
This theme focuses essentially on sound synthesis and processing methods based on recorded sounds or large sound corpora.
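A minimal sketch of the corpus-based idea, under the assumption that the corpus is already segmented into short grains described by a single feature: a target descriptor curve drives nearest-neighbor grain selection, and the chosen grains are naively concatenated. Real systems use richer descriptor spaces and crossfading, so this only illustrates the selection step.

    # A minimal sketch of corpus-based synthesis from recorded grains;
    # the single-feature corpus and naive concatenation are assumptions.
    import numpy as np

    def select_grains(corpus_features, target_curve):
        """For each target value, pick the index of the closest grain."""
        corpus_features = np.asarray(corpus_features, dtype=float)
        return [int(np.argmin(np.abs(corpus_features - t))) for t in target_curve]

    def concatenate(grains, indices):
        """Naive concatenation of the selected grains (no crossfade)."""
        return np.concatenate([grains[i] for i in indices])

    # Example: a toy corpus of sine grains whose "feature" is their frequency.
    sr, dur = 44100, 0.05
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    freqs = np.arange(200, 2000, 100)                     # corpus descriptors
    grains = [np.sin(2 * np.pi * f * t) for f in freqs]
    target = np.linspace(300, 1500, 40)                   # desired trajectory
    out = concatenate(grains, select_grains(freqs, target))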
Systems for Gesture Capture and Augmented Instruments
This theme focuses on the team's developments in gestural interfaces and augmented instruments for music and the performing arts.
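By way of illustration, here is a minimal sketch of the mapping layer such an augmented instrument needs, assuming 3-axis accelerometer frames arriving at a fixed rate: the raw magnitude is smoothed with a one-pole filter and linearly scaled to a synthesis parameter range. The smoothing coefficient, the 0-2 g working range, and the cutoff mapping are all hypothetical choices, not a specific team design.

    # A minimal sketch of a sensor-to-sound mapping for an augmented
    # instrument; all numeric choices below are illustrative assumptions.
    import numpy as np

    class SensorMapper:
        def __init__(self, smoothing=0.9, out_range=(200.0, 4000.0)):
            self.smoothing = smoothing        # one-pole low-pass coefficient
            self.lo, self.hi = out_range      # e.g., filter cutoff in Hz
            self.energy = 0.0

        def step(self, accel_xyz):
            """Map one accelerometer frame to a control value in out_range."""
            magnitude = float(np.linalg.norm(accel_xyz))
            # smooth the raw magnitude so the control signal does not jitter
            self.energy = (self.smoothing * self.energy
                           + (1.0 - self.smoothing) * magnitude)
            # clip to an assumed 0..2 g working range, then scale
            norm = min(max(self.energy / 2.0, 0.0), 1.0)
            return self.lo + norm * (self.hi - self.lo)

    mapper = SensorMapper()
    cutoff = mapper.step([0.1, 0.9, 0.3])   # one frame -> one control value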
Specialist Areas
Interactivity, real-time computing, human-computer interaction, signal processing, motion capture, sound and gesture modeling, statistical modeling and machine learning, real-time sound analysis and synthesis.