Spat (or Spatialisateur in French) is a real-time spatial audio processor that allows composers, sound artists, performers, and sound engineers to control the localization of sound sources in 3D auditory spaces. In addition, Spat provides a powerful reverberation engine that can be applied to real and virtual auditory spaces.
The processor receives sounds from instrumental or synthetic sources, adds spatialization effects in real-time, and outputs signals for reproduction on an electroacoustic system (loudspeakers or headphones).
Its modular signal-processing architecture and design are guided by computational efficiency and configurability. In particular, this allows straightforward adaptation to various multichannel output formats and reproduction setups, over loudspeakers or headphones, while the control interface gives direct access to perceptually relevant parameters for specifying distance and reverberation effects, regardless of the chosen reproduction format.
Another distinctive feature of Spat is its room-effect control interface, which relies on perceptual criteria. This lets users intuitively specify the characteristics of a room without resorting to an acoustical or architectural vocabulary.
Spat relies on an efficient signal-processing library written in C++ to provide state-of-the-art technologies. The software bundle comes as a set of Max/MSP external objects (i.e. plugins that can be inserted into the Max environment). The bundle contains more than 250 external objects, many abstraction patchers, help patches, tutorials, a large HRTF database, etc. Spat objects are highly configurable, and most of them support up to 8192 input/output channels.
Spat's main features include:
- sound spatialization (panning) in 2D or 3D. The supported panning algorithms include:
stereo (AB, XY, MS), binaural (with near-field compensation) and transaural, vector-base amplitude panning (VBAP), vector-base intensity panning (VBIP), distance-based amplitude panning (DBAP), nearest-neighbor amplitude panning (KNN), speaker-placement correction amplitude panning (SPCAP), B-format and higher-order Ambisonics (HOA) without order restrictions, near-field compensated higher-order Ambisonics (NFC-HOA), wave field synthesis (WFS), layer-based amplitude panning (LBAP), etc.
- artificial reverberation. Multichannel scalable/tunable algorithmic reverberation based on feedback delay networks. Efficient multichannel real-time convolution without latency.
- perceptual control of the acoustic quality of the room: warmth and brilliance; presence/proximity of the sound source; room presence; early and late reverberation; heaviness and liveness. Easy control over the radiation of sound sources (aperture and orientation).
- low-level signal processing: equalization, Doppler effect, air absorption, etc.
- graphical user interfaces for controlling/authoring/monitoring the spatial sound scene.
- many objects to create/edit/transform spatial trajectories.
- many useful tools to manipulate multichannel audio signals: multichannel sound file player/recorder supporting large files (RF64); multichannel EQ, compressor, limiter, gate, etc.
- utility tools for linear time code (SMPTE), quaternions, etc.
- tools for room acoustics and/or speakers calibration: delays/gains measurement and correction; measurement, analysis and denoising of multichannel room impulse responses, etc.
- headphone monitoring of any multichannel stream.
- various audio effects: stereo enlargement, Leslie cabinet simulation, ping pong delays, graphical equalizers, parametric equalizers, etc.
- OSC remote control of all processors.
- full-fledged mixing environment (Panoramix) for 3D audio.
- import/export and real-time rendering of object-based audio according to the Audio Definition Model (ADM).
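To make one member of the panning family above concrete: distance-based amplitude panning (DBAP) weights each loudspeaker by the inverse of its distance to the virtual source, with a rolloff expressed in dB per doubling of distance. Below is a minimal sketch in Python, not Spat's actual implementation; the function name, the `blur` regularization term, and the default values are illustrative:

```python
import math

def dbap_gains(source, speakers, rolloff_db=6.0, blur=0.2):
    """Per-speaker amplitude gains for one source position (DBAP sketch).

    rolloff_db: attenuation per doubling of distance (6 dB is a common choice).
    blur: small regularization avoiding a singularity when the source
          coincides with a speaker (illustrative, not a Spat parameter).
    """
    # Rolloff exponent: amplitude ~ 1/d^a, with a ~= 1 for 6 dB per doubling.
    a = rolloff_db / (20.0 * math.log10(2.0))
    dists = [math.sqrt(sum((s - p) ** 2 for s, p in zip(source, spk)) + blur ** 2)
             for spk in speakers]
    raw = [1.0 / d ** a for d in dists]
    # Normalize so the gains preserve total power (sum of squares == 1).
    k = 1.0 / math.sqrt(sum(v * v for v in raw))
    return [k * v for v in raw]
```

For a source equidistant from all speakers, the gains come out equal, and moving the source toward one speaker smoothly concentrates energy there, which is why DBAP is popular for irregular loudspeaker layouts.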
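The algorithmic reverberation listed above is built on feedback delay networks (FDNs): a bank of delay lines whose outputs are mixed through an orthogonal feedback matrix. The following toy mono FDN is a didactic approximation only, not Spat's engine; the delay lengths and feedback gain are arbitrary choices for illustration:

```python
def fdn_reverb(x, delays=(241, 307, 413, 529), g=0.75):
    """Minimal 4-line feedback delay network (mono in, mono out).

    Delay-line outputs are mixed through a normalized 4x4 Hadamard matrix
    (orthogonal), so any feedback gain g < 1 keeps the loop stable.
    """
    H = [[1,  1,  1,  1],
         [1, -1,  1, -1],
         [1,  1, -1, -1],
         [1, -1, -1,  1]]
    lines = [[0.0] * d for d in delays]  # circular delay buffers
    idx = [0, 0, 0, 0]
    y = []
    for s in x:
        outs = [lines[i][idx[i]] for i in range(4)]  # read delay taps
        y.append(s + sum(outs))                      # direct sound + taps
        for i in range(4):
            # 0.5 * H is orthonormal; scale by g for decay.
            fb = g * 0.5 * sum(H[i][j] * outs[j] for j in range(4))
            lines[i][idx[i]] = s + fb                # write input + feedback
            idx[i] = (idx[i] + 1) % delays[i]
    return y
```

Mutually prime delay lengths spread the echoes in time, and the lossy feedback loop produces the exponentially decaying tail characteristic of late reverberation; a production FDN (as in Spat) adds per-line filters to shape decay time per frequency band.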
Available on macOS and Windows.
Fields of Application
- Film & Video
- Scientific Research & Development
- Virtual & Augmented Reality
- Sound Design
Spat is used in a variety of contexts:
- Live diffusion of sounds for concerts, sound installations, and real-time spatialization. Composers can associate each note or sound event in the score with a room effect or a specific position in space. Spat can be controlled by a sequencer, a score-following system, or any other algorithmic approach. Being integrated into the Max environment, Spat can easily be linked to any remote control device (tracking system, tablet, smartphone, joystick, gestural sensors, etc.).
- Mixing and post-production. A spatialization module can be assigned to each channel of the mixing desk, providing intuitive, global control of the position of each source and its associated room effect.
- Virtual reality. The spatialized auditory component is essential to creating sensations of presence and immersion in virtual reality applications and interactive installations. In such scenarios, Spat's binaural mode (3D reproduction over headphones) is particularly well suited. Binaural rendering is remarkably convincing when the processor is linked either to a tracking system (which follows the subject's position and orientation) or to gestural controllers.
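Because every Spat processor can be remote-controlled over OSC, any environment that speaks the protocol (a sequencer, a tracking system, a custom script) can drive it. As a sketch of what travels on the wire, here is a minimal OSC message encoder in pure Python; the address pattern in the usage note is illustrative, not a documented Spat namespace:

```python
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message supporting float and int arguments.

    OSC strings are NUL-terminated and padded to 4-byte boundaries;
    numeric arguments are big-endian 32-bit values.
    """
    def pad(b):
        # Always append at least one NUL, then pad to a multiple of 4.
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "".join("f" if isinstance(a, float) else "i" for a in args)
    data = pad(address.encode()) + pad(tags.encode())
    for a in args:
        data += struct.pack(">f" if isinstance(a, float) else ">i", a)
    return data
```

Sending `osc_message("/source/1/azim", 45.0)` over UDP to the port a Spat patch listens on would, under that hypothetical address pattern, set the azimuth of source 1; in practice one would consult the OSC namespace documented for each Spat object.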