This project investigated how interactive machine learning could support the practice of a music therapist working with disabled children.
Some children have cognitive and/or physical disabilities that prevent them from playing the same musical instruments as the therapist, which hinders communication. We wanted to test whether interactive machine learning could overcome this barrier by enabling rapid prototyping of musical instruments adapted to each child's range of motion.
We conducted a series of studies in close collaboration with a therapist, in their working environment, to identify their needs. Our fully functioning "grab-and-play" software is now used autonomously by the therapist in individual sessions and in public performances created together with disabled children.
The project was developed with Rebecca Fiebrink, in collaboration with the Department of Computing at Goldsmiths, University of London, and NMPAT, as part of the ENS Paris-Saclay Pre-doctoral Research program.