somasticks

somasticks are augmented drumsticks that use unsupervised learning to emphasize the somatic side of drumming practice.

Unlike standard drumsticks, somasticks do not need to strike any material to produce sound; instead, they leverage embodied listening to drive musical performance. Specifically, they can be waved continuously in the air to trigger recorded drum sounds, in an expressive workflow that lets the performer explore various motion qualities in reaction to the internal bodily sensations produced by the sounds.

somasticks combine an unsupervised learning prototype with hardware elements. We used real drumsticks to create gestural affordances that are naturally related to drumming practice. We then used an Online Gaussian Mixture Model to create a dynamic mapping between drumming motion data and drum sounds.
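
As a rough illustration of how such a dynamic mapping can work, here is a minimal C++ sketch of one online GMM update step: each incoming motion frame softly reassigns and nudges the mixture components, and the resulting responsibilities could act as mixing gains over a corpus of drum sounds. The dimensionality, component count, learning rate, and all names are illustrative assumptions, not the project's actual implementation (which runs in Max/MSP, as described below).

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <cmath>
#include <cstdio>
#include <vector>

constexpr int kDims = 3;        // e.g. one 3-axis accelerometer frame (assumed)
constexpr int kComponents = 4;  // one Gaussian per motion-quality cluster (assumed)
constexpr double kTwoPi = 6.283185307179586;

struct Gaussian {
    std::array<double, kDims> mean{};
    std::array<double, kDims> var{};   // diagonal covariance
    double weight = 1.0 / kComponents;
};

// Weighted diagonal-covariance Gaussian density at x.
double density(const Gaussian& g, const std::array<double, kDims>& x) {
    double logp = 0.0;
    for (int d = 0; d < kDims; ++d) {
        const double diff = x[d] - g.mean[d];
        logp += -0.5 * diff * diff / g.var[d] - 0.5 * std::log(kTwoPi * g.var[d]);
    }
    return g.weight * std::exp(logp);
}

// One stochastic update: compute responsibilities for the incoming frame,
// then nudge each component's weight, mean, and variance toward it.
void onlineUpdate(std::vector<Gaussian>& mix,
                  const std::array<double, kDims>& x, double rate) {
    std::vector<double> resp(mix.size());
    double total = 0.0;
    for (std::size_t k = 0; k < mix.size(); ++k) {
        resp[k] = density(mix[k], x);
        total += resp[k];
    }
    if (total <= 0.0) return;  // frame too far from every component
    for (std::size_t k = 0; k < mix.size(); ++k) {
        const double r = resp[k] / total;  // responsibility of component k
        mix[k].weight = (1.0 - rate) * mix[k].weight + rate * r;
        for (int d = 0; d < kDims; ++d) {
            const double diff = x[d] - mix[k].mean[d];
            mix[k].mean[d] += rate * r * diff;
            mix[k].var[d] += rate * r * (diff * diff - mix[k].var[d]);
            mix[k].var[d] = std::max(mix[k].var[d], 1e-6);  // keep positive
        }
    }
}

int main() {
    std::vector<Gaussian> mix(kComponents);
    for (int k = 0; k < kComponents; ++k) {
        mix[k].mean = {double(k), 0.0, 0.0};  // spread so components specialize
        mix[k].var = {1.0, 1.0, 1.0};
    }
    const std::array<double, kDims> frame = {0.2, -0.1, 0.9};  // fake sensor frame
    onlineUpdate(mix, frame, 0.05);
    // Responsibilities / weights could then drive gains over the drum corpus.
    for (int k = 0; k < kComponents; ++k)
        std::printf("component %d weight %.3f\n", k, mix[k].weight);
    return 0;
}
```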

The current prototype uses R-IoT wireless accelerometers, to which we connected two switches and a strain gauge for additional control over the machine learning. The firmware is coded in C++ using the Energia platform; data is processed in Max/MSP using the MuBu toolbox, with wavelet analysis for motion data preprocessing and concatenative synthesis for sound.
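
For a sense of how such add-on controls might be read on the microcontroller, here is a minimal Energia-style sketch (C++ with the Arduino-style API that Energia provides). The pin assignments and serial output format are hypothetical; the actual R-IoT firmware streams its sensor data wirelessly rather than over serial.

```cpp
// Hypothetical Energia-style sketch: polls the two switches and the strain
// gauge and prints them over serial. Pin numbers are assumed, not the
// project's actual wiring.
const int kSwitchAPin = 2;   // first switch (assumed wiring, active low)
const int kSwitchBPin = 3;   // second switch (assumed wiring, active low)
const int kGaugePin   = A0;  // strain gauge via an amplifier (assumed)

void setup() {
  pinMode(kSwitchAPin, INPUT_PULLUP);
  pinMode(kSwitchBPin, INPUT_PULLUP);
  Serial.begin(115200);
}

void loop() {
  // Switches could gate learning states (e.g. freeze / resume adaptation);
  // the gauge adds a continuous control dimension.
  const int switchA = (digitalRead(kSwitchAPin) == LOW);
  const int switchB = (digitalRead(kSwitchBPin) == LOW);
  const int gauge   = analogRead(kGaugePin);

  Serial.print(switchA);
  Serial.print(' ');
  Serial.print(switchB);
  Serial.print(' ');
  Serial.println(gauge);

  delay(10);  // ~100 Hz control-rate polling
}
```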

somasticks are currently practiced within the Daim™ art project.

Year
2018
Credits
The project was developed with Frédéric Bevilacqua, Jules Françoise, Riccardo Borghesi, Djellal Chalabi and Emmanuel Flety, in collaboration with the ISMM group at IRCAM and the School of Interactive Arts and Technology (SFU), in the context of a PhD thesis at Sorbonne Université.
Publications
Paper at DIS (2021)
Paper at NIME (2017)
Events
movA workshop @ Stereolux, Nantes, FR (Apr. 2019)
Code
GitHub
