Diemo Schwarz. Photo: Shun Kambe


We are very happy to welcome Diemo Schwarz as the next guest of our Distinguished Talks series. This event is the seventh in the series at the Institute for Music Informatics and Musicology of the Hochschule für Musik, Karlsruhe.

In this one-day presentation and workshop, Diemo Schwarz will discuss corpus-based concatenative synthesis (CBCS), a recent sound synthesis method based on audio descriptor analysis, in which sound is synthesised by interactive selection of segments from a sound database that match the desired audio characteristics.
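To make the idea concrete, here is a minimal, hypothetical sketch (not MuBu/CataRT code) of the CBCS principle described above: every segment of a corpus is reduced to a small descriptor vector, and synthesis selects the segment whose descriptors lie closest to a desired target. The descriptors (RMS loudness and spectral centroid) and the weighting are illustrative choices, not the actual CataRT feature set.

```python
import numpy as np

def describe(segment, sr=44100):
    """Toy descriptors: RMS loudness and spectral centroid (Hz)."""
    rms = np.sqrt(np.mean(segment ** 2))
    spectrum = np.abs(np.fft.rfft(segment))
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([rms, centroid])

def build_corpus(segments):
    """Analyse every segment once; keep (descriptors, audio) pairs."""
    return [(describe(s), s) for s in segments]

def select(corpus, target, weights=(1.0, 1.0 / 1000.0)):
    """Return the segment whose descriptors best match the target."""
    w = np.asarray(weights)  # balance loudness against centroid (Hz)
    dists = [np.linalg.norm((d - target) * w) for d, _ in corpus]
    return corpus[int(np.argmin(dists))][1]

# Example: three synthetic segments; ask for a fairly loud, bright sound.
sr = 44100
t = np.linspace(0, 0.1, int(sr * 0.1), endpoint=False)
segments = [0.2 * np.sin(2 * np.pi * 220 * t),
            0.8 * np.sin(2 * np.pi * 880 * t),
            0.5 * np.sin(2 * np.pi * 440 * t)]
corpus = build_corpus(segments)
chosen = select(corpus, target=np.array([0.6, 900.0]))
```

In a real-time system this selection runs continuously, with the target driven by a controller or by analysis of live audio input.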

In the presentation, Diemo will talk about the current use of corpus-based concatenative synthesis (CBCS) in composition and gestural control, for creating augmented instruments, or for transcribing and re-orchestrating environmental sound. In live improvisation, CBCS can record a corpus from live instruments to create a symbiotic relationship between performers, establishing a sound-based coupling between them in addition to improvisation based on abstract musical ideas.

In the second part of the event, workshop participants will gain hands-on experience using the MuBu and CataRT extensions for Cycling '74's Max environment. We will experiment and build instruments using sensors, microphones, and cameras brought by workshop participants. Some related teaser videos are below. We are very much looking forward to this event.

Use of CataRT for texture synthesis

CataRT controlled by piezo microphones




Diemo Schwarz is a researcher–developer in real-time applications of computers to music with the aim of improving musical interaction, notably sound analysis–synthesis and interactive corpus-based concatenative synthesis.

Working since 1997 at IRCAM (Institut de Recherche et Coordination Acoustique/Musique) in Paris, he has combined his studies of computer science and computational linguistics at the University of Stuttgart, Germany, with his interest in music, being an active performer and musician on drums and laptop, either solo or in various collaborations with other improvising musicians.

He holds a PhD in computer science applied to music from the University of Paris, awarded in 2004 for the development of a new method of concatenative musical sound synthesis by unit selection from a large database. This work is continued in the CataRT application for real-time interactive corpus-based concatenative synthesis within Ircam's Sound Music Movement Interaction team (ISMM).

His current research comprises the use of tangible interfaces for multi-modal interaction, and generative audio and sound textures for video games, virtual and augmented reality, and the creative industries.

In 2017 he was the Edgard Varèse Guest Professor for computer music at the TU Berlin.


Distinguished Talk (14:00 – 15:00)

Interacting with a Corpus of Sounds

In this talk, we will look at the current use of CBCS in composition and in gestural control of navigation through the sound space, where each combination of input device and synthesis mode redefines the affordances of the interaction and thus constitutes a new digital musical instrument. When CBCS is controlled by descriptors analysed from audio input, it can be used to transform sound in surprising ways, to create augmented instruments, or to transcribe and re-orchestrate environmental sound; this special case of CBCS is commonly called "audio mosaicing". For live performance, especially in an improvisation between an instrumental performer and a CBCS performer, recording the corpus live from the instrument creates a symbiotic relationship between the two performers and a stronger, more direct coupling between them, compared to traditional improvisation, where abstract musical ideas are exchanged.
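The "audio mosaicing" case mentioned above can be sketched in a few lines: the input signal is cut into frames, each frame is analysed, and the best-matching unit from a pre-analysed corpus replaces it in the output. This is a hedged, assumption-laden sketch (one descriptor, plain frame concatenation), not the actual CataRT implementation.

```python
import numpy as np

FRAME = 1024  # assumed analysis/synthesis frame size, in samples

def loudness(frame):
    """Single toy descriptor: RMS loudness of a frame."""
    return float(np.sqrt(np.mean(frame ** 2)))

def mosaic(input_signal, corpus_units):
    """Re-synthesise input_signal frame by frame from corpus units."""
    descriptors = np.array([loudness(u) for u in corpus_units])
    out = []
    for start in range(0, len(input_signal) - FRAME + 1, FRAME):
        target = loudness(input_signal[start:start + FRAME])
        best = int(np.argmin(np.abs(descriptors - target)))
        out.append(corpus_units[best])  # concatenate the chosen unit
    return np.concatenate(out) if out else np.array([])

# Example: a quiet-to-loud noise ramp is re-orchestrated from two units,
# so the output follows the input's loudness contour with corpus material.
rng = np.random.default_rng(0)
quiet = 0.1 * rng.standard_normal(FRAME)
loud = 0.9 * rng.standard_normal(FRAME)
ramp = np.linspace(0.0, 1.0, 4 * FRAME) * rng.standard_normal(4 * FRAME)
result = mosaic(ramp, [quiet, loud])
```

A live setup would do the same per incoming audio buffer, with many more units and descriptors, and with crossfades between selected units.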

CBCS has also found a very promising application in environmental sound texture synthesis for audio-visual production in cinema and games, and in sound installations such as the Dirty Tangible Interfaces (DIRTI), which open up discovery of and interaction with rich sound corpora via tangible everyday objects to the general public and to children.


Workshop (15:15 – 18:00)

Building Instruments and Sound Design/Composition Systems using CBCS

In this workshop we will explore practical musical and artistic applications of CBCS based on the MuBu and CataRT extensions for Max from Ircam's ISMM team. These applications can be expressive digital musical instruments (DMIs), interactive sound installations, or compositional and sound design systems, using as input existing controllers or sensors brought by attendees (touch, movement, or distance sensors), piezo or contact microphones on arbitrary surfaces and objects, or cameras.
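Whatever the input device, the core of such an instrument is a mapping from controller coordinates to a position in descriptor space. As an illustrative sketch (again not MuBu/CataRT code), corpus descriptors can be normalised to a unit square so that any 2-D controller reading in [0, 1] x [0, 1], whether from a touch surface, a camera, or a distance sensor pair, directly selects the nearest unit:

```python
import numpy as np

def normalise(descriptors):
    """Map each descriptor column to the range [0, 1]."""
    d = np.asarray(descriptors, dtype=float)
    lo, hi = d.min(axis=0), d.max(axis=0)
    return (d - lo) / np.where(hi > lo, hi - lo, 1.0)

def pick(norm_descriptors, x, y):
    """Index of the corpus unit nearest the controller position (x, y)."""
    target = np.array([x, y])
    return int(np.argmin(np.linalg.norm(norm_descriptors - target, axis=1)))

# Example: four units laid out by (spectral centroid in Hz, loudness);
# a touch in the top-right corner selects the brightest, loudest unit.
descriptors = [(200.0, 0.1), (400.0, 0.9), (2000.0, 0.2), (4000.0, 0.8)]
grid = normalise(descriptors)
unit = pick(grid, 1.0, 1.0)
```

Normalising per descriptor keeps the whole corpus reachable regardless of the units or ranges of the chosen descriptors, which is what makes swapping input devices cheap.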