Somax 2 is an application for musical improvisation and composition. It is implemented in Max and is based on a generative model that uses a process similar to concatenative synthesis to provide stylistically coherent improvisation, while listening and adapting to a musician (or any other type of audio or MIDI source) in real time. The model operates in the symbolic domain and is trained on a musical corpus, consisting of one or more MIDI files, from which it draws the material used for improvisation. The model can be used with little configuration to interact autonomously with a musician, but it also allows manual control of its generative process, effectively letting the model serve as an instrument that can be played in its own right.
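To make the idea of concatenative-style recombination from a symbolic corpus concrete, here is a minimal, hypothetical sketch (not Somax's actual implementation): a toy model that learns which events follow which in a MIDI-pitch corpus and generates new sequences by chaining observed continuations, so the output recombines fragments of the source material.

```python
import random

def build_model(corpus):
    """Map each pitch to the list of pitches that followed it in the corpus."""
    continuations = {}
    for a, b in zip(corpus, corpus[1:]):
        continuations.setdefault(a, []).append(b)
    return continuations

def improvise(model, start, length, rng=None):
    """Generate a sequence by repeatedly picking an observed continuation
    of the last output; on a dead end, restart from a random corpus state."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length - 1):
        options = model.get(out[-1])
        if not options:
            options = [rng.choice(list(model))]
        out.append(rng.choice(options))
    return out

corpus = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]  # MIDI pitch numbers
model = build_model(corpus)
print(improvise(model, start=60, length=8))
```

The real system is far richer (it matches harmonic and melodic context from a live input against the corpus in real time), but the sketch shows the basic principle: generation is constrained to transitions actually present in the training material, which is what makes the output stylistically coherent with the corpus.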
While the application can be used straight out of the box with little configuration (see Getting Started), it is also designed as a library, allowing the user to create custom models as well as set up networks of multiple models and sources that are listening to and interacting with each other.
Somax 2 is a totally new version of the mythical Somax reactive co-improvisation paradigm designed in the IRCAM Music Representation team but never publicly released until now. Written in Max and Python, it features a modular multithreaded implementation, multiple wirelessly interacting players, a new UI design with tutorials and documentation, as well as a number of new interaction parameters.
Get an initial taste of Somax 2's basic flavor by clicking on these two video examples. Go directly to Vimeo to read the explanations.
Somax 2 is provided with an interactive tutorial that briefly presents the different modules of Somax and introduces the first steps towards interacting with the model.
Individual help files exist as well for each Max object, outlining how to use the object, its parametric controls, and a number of use cases.
To install Somax 2
Go to the Somax 2 GitHub repository and follow the Readme instructions.
To learn more about Somax 2
- Read a brief overview of how the Somax interaction model works
- Go to the Somax page for more technical content and access to publications and media
To communicate with the team
To report any issue or bug, please contact Joakim Borg at Ircam. To discuss ideas and projects, write to Gérard Assayag and Joakim Borg at Ircam. Please let us know about your usages and extensions.
firstname [dot] secondname [at] ircam [dot] fr
The Somax 2 project is part of the ANR project MERCI (Mixed Musical Reality with Creative Instruments) ANR-19-CE33-0010, and the ERC project REACH (Raising Co-creativity in Cyber-Human Musicianship) ERC-ADG-19-883313. PI: Gérard Assayag, Music Representation Team, IRCAM STMS Lab (CNRS, Sorbonne University, Ministry of Culture).