Improving accessibility of musical expression with machine learning

The Sound Control project investigated how interactive machine learning could support the practice of music therapists working with disabled children.

The Sound Control project was initiated by Rebecca Fiebrink, along with music educators and music therapists associated with a community music centre's Musical Inclusion programme. Programme members were interested in more flexibly customising digital instruments for the disadvantaged children they worked with, including but not limited to children with physical and learning disabilities.

Our team led participatory design with Musical Inclusion programme personnel and other music therapists and educators from the local community. In later workshops, we taught participants to use prototype technologies developed for the project, such as Grab-and-play, then elicited feedback about them. Practitioners found customisation useful for helping children recognise and exercise agency in their environment, while encouraging movement, listening, and social goals. The technologies continue to be used by the therapists and teachers in their workshops, and have even enabled children with disabilities to take part in public music performances.
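To give a sense of what this kind of customisation involves, interactive machine learning lets a practitioner build a gesture-to-sound mapping by demonstration rather than by programming: record a few paired examples of sensor input and desired sound, train a model, and play. The sketch below is a minimal, hypothetical illustration of that workflow, not the project's actual code; the sensor features, sound parameters, and the use of scikit-learn's KNeighborsRegressor are all assumptions made for the example.

```python
# Hypothetical sketch of an interactive machine learning workflow for
# mapping movement to sound, in the spirit of tools like Sound Control.
# Feature and parameter choices here are illustrative assumptions only.

import numpy as np
from sklearn.neighbors import KNeighborsRegressor

class GestureToSoundMapper:
    """Learns a sensor-to-sound mapping from recorded demonstrations."""

    def __init__(self):
        self.examples_in = []   # sensor readings (e.g. two accelerometer axes)
        self.examples_out = []  # sound parameters (e.g. pitch in Hz, volume 0-1)
        # Distance-weighted neighbours interpolate smoothly between examples.
        self.model = KNeighborsRegressor(n_neighbors=2, weights="distance")

    def record(self, sensor_reading, sound_params):
        # The practitioner demonstrates: "when the child moves like this,
        # the instrument should sound like that."
        self.examples_in.append(sensor_reading)
        self.examples_out.append(sound_params)

    def train(self):
        # A handful of examples is enough; no ML expertise is required.
        self.model.fit(np.array(self.examples_in), np.array(self.examples_out))

    def run(self, sensor_reading):
        # Called continuously at runtime: live sensor input -> sound parameters.
        return self.model.predict(np.array([sensor_reading]))[0]


mapper = GestureToSoundMapper()
mapper.record([0.0, 0.1], [220.0, 0.2])   # gentle motion -> low, quiet note
mapper.record([0.9, 0.8], [880.0, 0.9])   # large motion -> high, loud note
mapper.train()
print(mapper.run([0.5, 0.4]))             # interpolated pitch and volume
```

The appeal of this style of system for the practitioners described above is that examples can be re-recorded and the model retrained in seconds, so an instrument can be tailored to one child's particular range of motion during a session.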

Year
2016–2019

Credits
The project was developed with Samuel Thompson Parke-Wolfe and Rebecca Fiebrink in collaboration with the Department of Computing at Goldsmiths, University of London, and NMPAT, in the context of the ENS Paris-Saclay pre-doctoral research programme.

Publications
Paper at NIME (2019)
Pre-doctoral report (2016)
Paper at ICMC (2016)
