Media type:
E-Article
Title:
Embodying Spatial Sound Synthesis with AI in Two Compositions for
Instruments and 3-D Electronics
Contributor:
Einbond, Aaron;
Carpentier, Thibaut;
Schwarz, Diemo;
Bresson, Jean
Imprint:
MIT Press, 2022
Published in:
Computer Music Journal
Language:
English
DOI:
10.1162/comj_a_00664
ISSN:
0148-9267;
1531-5169
Description:
Abstract:
The situated spatial presence of musical instruments has been well studied in acoustics and music perception research, but it has not yet been a focus of human–AI interaction. We respond critically to this trend by seeking to reembody interactive electronics using data derived from natural acoustic phenomena. Two musical works, composed for human soloist and computer-generated live electronics, are intended to situate the listener in an immersive sonic environment in which real and virtual sources blend seamlessly. To do so, we experimented with two contrasting reproduction setups: a surrounding Ambisonic loudspeaker dome and a compact spherical loudspeaker array for radiation synthesis. A large database of measured radiation patterns of orchestral instruments served as a training set for machine learning models to control spatially rich 3-D patterns for electronic sounds. These are exploited during performance in response to live sounds captured with a spherical microphone array and used to train computer models of improvisation and to trigger corpus-based spatial synthesis. We show how AI techniques can harness complex, multidimensional spatial data in the context of computer-assisted composition and human–computer interactive improvisation.