somasticks are augmented drumsticks that use unsupervised learning to emphasize the somatic side of drumming practice.
Unlike standard drumsticks, somasticks do not need to hit any physical object to produce sound; instead, they leverage embodied listening to drive musical performance. Specifically, they may be waved continuously in the air to trigger recorded drum sounds, in an expressive workflow where the performer explores various playing modes in reaction to the internal bodily sensations the sounds produce.
somasticks combine a prototype unsupervised learning model with hardware elements. We used real drumsticks to create gestural affordances naturally related to drumming practice. We embedded the sticks with wireless sensors to feed an online Gaussian Mixture Model with drumming motion data. Finally, we leveraged the online behaviour of the unsupervised learning model to design interactive drumming sound processes. somasticks are currently practiced within the daim art project.
The current prototype equips standard drumsticks with R-IoT embedded boards, to which we connected two switches and a strain gauge for additional control over the machine learning techniques. The firmware is coded in C++ using the Energia platform; data is processed in Max/MSP using the MuBu toolbox, with wavelet analysis and online Gaussian mixture models for motion, and concatenative synthesis for sound.
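The wavelet analysis stage runs inside the Max/MSP patch via MuBu, but the kind of feature it produces can be sketched independently. The following minimal NumPy example (an assumption-laden stand-in, not the project's code) computes per-scale energies from a Haar wavelet decomposition of a short window of one accelerometer axis, yielding a small multiresolution descriptor of how fast and how smoothly the stick is moving.

```python
import numpy as np

def haar_dwt_energies(frame):
    """Per-scale detail energies of a Haar wavelet decomposition.

    `frame` is a 1-D window of motion samples whose length is a power of
    two. Returns one energy value per scale, finest scale first. This is
    an illustrative stand-in for the wavelet analysis stage, which in the
    actual prototype is done with MuBu inside Max/MSP.
    """
    x = np.asarray(frame, dtype=float)
    energies = []
    while len(x) > 1:
        # Orthonormal Haar step: pairwise averages and differences.
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        energies.append(float(np.sum(detail ** 2)))  # energy at this scale
        x = approx  # recurse on the coarser approximation
    return np.array(energies)
```

A feature vector like this, computed frame by frame, is the sort of input an online mixture model can cluster into playing modes without any labeled examples.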
The project was developed with Frédéric Bevilacqua, Jules Françoise, Riccardo Borghesi, Djellal Chalabi and Emmanuel Flety, in collaboration with the ISMM group at IRCAM and the School of Interactive Arts and Technology (SFU), in the context of the Sorbonne Université Doctorate in Computer Science.
Available on GitHub