Les Mercredis de STMS welcome Nicolas Souchal, doctoral student in the laboratory's Analysis of Musical Practices team (APM), and Diemo Schwarz of the Sound Music Movement Interaction team (ISMM), who will present their work entitled Audio-Visual Augmentation of the Trumpet: Navigating Control and Unpredictability in a Long-Term Research-Creation Collaboration.
To follow the talk remotely, here is the Zoom link:
https://us06web.zoom.us/j/89611100449?pwd=6vV4nNKwX83wEEdaDYVlSbNFffAwdd.1
Meeting ID: 896 1110 0449 / Passcode: 850132
Abstract:
First, we will describe an audio-augmented trumpet that, unlike typical sensor-based augmented instruments, mainly uses the sound of the instrument itself to control sound processing. Using real-time sound analysis to drive processes such as additive synthesis, resonators, and auto-convolution, we explore relationships between the performer and the augmented instrument that introduce unpredictability, navigating between moments of control and moments of adaptation to situations of non-control, which is particularly relevant to the practice of improvisation. Over the course of this long-term research-creation collaboration, we developed the most musically relevant analysis-to-synthesis mappings. Adopting a practice-based approach, grounded in auto-ethnographic principles, allowed the technological development to be adapted and refined in response to musical needs, extending beyond simple audio effects into a realm where the acoustic environment becomes less predictable and more of a partner in improvisation.
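The abstract leaves the concrete mappings open; as a rough illustration of the analysis-to-synthesis idea, here is a minimal Python sketch. It is not the authors' actual system: the choice of descriptors, the mapping, and all parameter values are assumptions made for illustration. Frame-wise loudness, spectral centroid, and a crude pitch estimate drive a small additive-synthesis layer.

```python
# Hypothetical sketch: audio descriptors extracted from the trumpet signal
# drive the parameters of an additive resynthesis layer.
import numpy as np

SR = 44100
FRAME = 2048

def descriptors(frame: np.ndarray, sr: int = SR) -> dict:
    """Per-frame loudness (RMS), spectral centroid, and a crude fundamental estimate."""
    window = np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(frame * window))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    rms = float(np.sqrt(np.mean(frame ** 2)))
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    f0 = float(freqs[np.argmax(spectrum[freqs < 2000.0])])  # strongest bin below 2 kHz
    return {"rms": rms, "centroid": centroid, "f0": f0}

def additive_frame(desc: dict, n_partials: int = 12, sr: int = SR) -> np.ndarray:
    """Assumed mapping: loudness scales the overall gain, brightness (centroid)
    flattens the roll-off of the partial amplitudes."""
    t = np.arange(FRAME) / sr
    tilt = np.clip(desc["centroid"] / 5000.0, 0.1, 1.0)
    out = np.zeros(FRAME)
    for k in range(1, n_partials + 1):
        out += (tilt ** (k - 1)) * np.sin(2 * np.pi * k * desc["f0"] * t)
    return desc["rms"] * out / n_partials

# Stand-in input: a harmonic test tone in place of a live trumpet frame.
test = sum(0.5 / k * np.sin(2 * np.pi * 440.0 * k * np.arange(FRAME) / SR) for k in range(1, 6))
print(descriptors(np.asarray(test)))
```

In the work described above, such mappings were developed and refined over the course of the collaboration, driven by musical needs rather than fixed in advance.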
We will then present and perform Extense, which combines this augmented trumpet with a video augmentation in which the trumpet sound browses through a corpus of close-up images of drawings by Elizabeth Saint-Jalmes. To this end, the principle of interactive corpus-based concatenative synthesis, until now most commonly applied to sound, is extended to the visual domain. Instead of creating music by navigating through sound grains in a space of audio descriptors, a corpus of still images has been constructed. By computing image descriptors (such as color, texture, luminosity, and entropy), this corpus can be navigated interactively, with control provided either by gesture recognition using motion sensors or by audio analysis of the trumpet performance. One of the key challenges of this performative system is to find, within the field of image processing, meaningful links between image descriptors and sound descriptors.
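The visual extension follows the same corpus-based logic. The sketch below is a hypothetical Python illustration of the principle, using numpy and scipy with an assumed audio-to-image mapping, not the actual Extense implementation: each image is reduced to a small descriptor vector, the corpus is indexed, and an incoming audio descriptor vector selects the nearest image.

```python
# Hypothetical sketch of corpus-based navigation extended to images.
import numpy as np
from scipy.spatial import cKDTree

def image_descriptors(img: np.ndarray) -> np.ndarray:
    """Reduce an RGB image to [luminosity, contrast, grey-level entropy], roughly in [0, 1]."""
    grey = img.mean(axis=2)
    luminosity = grey.mean() / 255.0
    contrast = grey.std() / 128.0
    counts, _ = np.histogram(grey, bins=64, range=(0, 255))
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum()) / 6.0   # log2(64) = 6 normalises to [0, 1]
    return np.array([luminosity, contrast, entropy])

# Build the corpus index (random arrays stand in for the scanned drawings).
rng = np.random.default_rng(0)
corpus = [rng.integers(0, 256, size=(64, 64, 3)).astype(float) for _ in range(100)]
index = cKDTree(np.stack([image_descriptors(im) for im in corpus]))

def select_image(audio_desc: dict) -> int:
    """Assumed mapping: loudness -> luminosity, brightness -> contrast,
    spectral flatness -> entropy; return the nearest image in descriptor space."""
    target = np.array([audio_desc["rms"], audio_desc["brightness"], audio_desc["flatness"]])
    _, nearest = index.query(target)
    return int(nearest)

print(select_image({"rms": 0.4, "brightness": 0.2, "flatness": 0.7}))
```

In the actual piece, descriptors such as color, texture, luminosity, and entropy are computed over the corpus of scanned drawings, navigation can also be driven by motion sensors, and deciding which audio descriptor should map to which image descriptor is precisely the open challenge mentioned above.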
The visual corpus is drawn from the work of Elizabeth Saint-Jalmes, specifically from the series Révolution, Sans titre, Afflux, and Orgasmes crus. The topologies that emerge in these works resonate with those present in the audio-augmented trumpet. To construct the corpus, over one hundred of the artist's works were photographed or scanned at high resolution and acquired in digital form.
Biographies:
Nicolas Souchal - trumpet
After an M1 focusing on the approaches of the improvisers George Lewis, Evan Parker and Jean-Luc Cappozzo, followed by an M2 entitled “Memories, images and the colonial fact in jazz and improvised music in France”, Nicolas Souchal is currently preparing a PhD under the co-direction of Alexandre Pierrepont (Musidanse laboratory, Université Paris 8 Vincennes - Saint-Denis) and Clément Canonne (Analyse des Pratiques Musicales team, IRCAM, Paris). Entitled “Pertes de contrôle dans des processus improvisés : les éprouver, les analyser, les provoquer, les exploiter” (Loss of control in improvised processes: testing, analyzing, provoking and exploiting them), his research-creation thesis project focuses on two levels: that of the augmented instrument and that of collective improvised musical practice.
A trumpeter in the field of contemporary jazz and improvised music, he is a co-founder of projects combining art and research. His releases and projects are available on his personal website: http://www.nicolassouchal.fr/
Diemo Schwarz - music and video computing, interaction design
Diemo Schwarz is a researcher at Ircam (STMS laboratory), an electronic music composer and an improvising musician. His scientific research focuses on the interaction between musician and machine, and on the exploitation of large sound corpora for real-time, interactive sound synthesis, either to give musicians immediate and expressive access to electronic sound worlds through gesture control, or for general-public installations with intuitive tangible interfaces, such as DIRTI (Dirty Tangible Interfaces). In 2017 he was a visiting professor at the Technical University of Berlin as part of the DAAD's Edgard Varèse programme, and in 2022 a resident at the IMéRA Institute for Advanced Research, Aix-Marseille University, as part of its arts, sciences and societies programme.