somasticks are augmented drumsticks that use unsupervised learning to emphasize the somatic side of drumming practice.
Unlike standard drumsticks, somasticks need not strike any surface to produce sound; instead, they leverage embodied listening to drive musical performance. Specifically, they may be waved continuously in the air to trigger recorded drum sounds, in an expressive workflow that lets the performer explore various motion qualities in response to the internal bodily sensations produced by the sounds.
somasticks combine a prototype unsupervised learning model with hardware elements. We used real drumsticks to create gestural affordances naturally related to drumming practice. We then used an online Gaussian Mixture Model to create a dynamic mapping between drumming motion data and drum sounds.
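The published description does not detail the learning algorithm, but the idea of an online Gaussian Mixture Model driving a motion-to-sound mapping can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the actual somasticks code: it uses stochastic EM updates on a diagonal-covariance mixture, with synthetic 3-axis "accelerometer" frames, and returns the most responsible component as a drum-sound index.

```python
import numpy as np

class OnlineGMM:
    """Minimal online (incremental) Gaussian mixture, diagonal covariances.

    Each incoming motion frame performs one stochastic EM step; the most
    responsible component then indexes a drum sound, so the
    motion-to-sound mapping adapts while the performer plays.
    """

    def __init__(self, init_means, lr=0.05):
        self.means = np.asarray(init_means, dtype=float)
        k, d = self.means.shape
        self.weights = np.full(k, 1.0 / k)
        self.vars = np.ones((k, d))
        self.lr = lr

    def _responsibilities(self, x):
        # Log-density of x under each diagonal Gaussian, plus mixture weight
        diff = x - self.means
        log_p = -0.5 * np.sum(diff**2 / self.vars
                              + np.log(2 * np.pi * self.vars), axis=1)
        log_p += np.log(self.weights)
        p = np.exp(log_p - log_p.max())   # stable softmax-style normalisation
        return p / p.sum()

    def update(self, x):
        """One stochastic EM step on a single frame; returns a sound index."""
        x = np.asarray(x, dtype=float)
        r = self._responsibilities(x)
        self.weights = (1 - self.lr) * self.weights + self.lr * r
        for k, rk in enumerate(r):
            step = self.lr * rk
            diff = x - self.means[k]          # uses the pre-update mean
            self.means[k] += step * diff
            self.vars[k] = np.maximum(
                self.vars[k] + step * (diff**2 - self.vars[k]), 1e-4)
        return int(np.argmax(r))

# Demo: two synthetic gesture clusters in 3-axis accelerometer space
rng = np.random.default_rng(1)
model = OnlineGMM(init_means=[[0.5, 0, 0], [-0.5, 0, 0]])  # seeded near the data
for _ in range(500):
    model.update(rng.normal([1.0, 0, 0], 0.1))
    model.update(rng.normal([-1.0, 0, 0], 0.1))
a = model.update([1.0, 0, 0])
b = model.update([-1.0, 0, 0])   # distinct gestures yield distinct sound indices
```

Because the model keeps updating during performance, the same design choice that makes the mapping "dynamic" also means the sound assigned to a gesture can drift as the performer's movement vocabulary evolves.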
The current prototype uses R-IoT wireless accelerometers, to which we connected two switches and a strain gauge for additional control over the machine learning techniques. The firmware is coded in C++ using the Energia platform; data is processed in Max/MSP with the MuBu toolbox, using wavelet analysis to preprocess the motion data and concatenative synthesis to generate sound.
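The wavelet preprocessing runs inside Max/MSP via MuBu, so no code for it appears here; as a rough stand-in, the sketch below shows one common way such a front end can work. All names and parameter values are illustrative assumptions: a small bank of complex Morlet wavelets turns a raw accelerometer signal into per-band magnitude envelopes, the kind of slowly varying features a mapping model can consume.

```python
import numpy as np

def wavelet_features(signal, fs=100.0, freqs=(2.0, 5.0, 10.0), cycles=5.0):
    """Magnitude envelopes from a small complex-Morlet filterbank.

    Returns one envelope per analysis frequency: a coarse stand-in for
    wavelet preprocessing of accelerometer data before mapping.
    """
    feats = []
    for f in freqs:
        sigma = cycles / (2 * np.pi * f)          # Gaussian width in seconds
        t = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
        wav = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
        wav /= np.abs(wav).sum()                  # roughly equalise band gains
        feats.append(np.abs(np.convolve(signal, wav, mode="same")))
    return np.stack(feats, axis=1)                # shape (n_samples, n_freqs)

# Demo: a 5 Hz "shake" should dominate the 5 Hz band
fs = 100.0
t = np.arange(0, 4, 1 / fs)
accel = np.sin(2 * np.pi * 5 * t)
F = wavelet_features(accel, fs=fs)
band_energy = F[100:300].mean(axis=0)             # average away edge effects
```

Envelopes like these capture how fast and how intensely the stick is being waved, rather than individual strike events, which suits continuous in-air gestures.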
somasticks are currently practiced within the daim art project.
The project was developed with Frédéric Bevilacqua, Jules Françoise, Riccardo Borghesi, Djellal Chalabi and Emmanuel Flety, in collaboration with the ISMM group at IRCAM and the School of Interactive Arts and Technology (SIAT, Simon Fraser University), in the context of the Sorbonne Université Doctorate in Computer Science.
Paper at DIS (2021)
Paper at NIME (2017)
movA workshop at Stereolux, Nantes, France (April 2019)