The Sound Control project investigated how interactive machine learning could support the practice of music therapists working with disabled children.
The Sound Control project took place within a community music centre’s Musical Inclusion programme. Music therapists and educators there wanted more flexible ways to customise digital instruments for the disadvantaged children they worked with—including, but not limited to, children with physical and learning disabilities.
Our team led participatory design sessions with Musical Inclusion programme personnel and other music therapists and educators from the local community. In later workshops, we taught participants to use prototype technologies developed for the project, such as Grab-and-play, then elicited their feedback. Practitioners found customisation useful for helping children recognise and exercise agency in their environment, while supporting movement, listening, and social goals. The technologies continue to be used by the therapists and teachers in their workshops, even enabling children to take part in public music performances.
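To give a flavour of the interactive machine learning behind such customisation, here is a minimal, purely illustrative sketch (not the project's actual Sound Control or Grab-and-play implementation): a practitioner records a few sensor-feature examples per sound, and new input is then mapped to the nearest recorded example. The feature vectors, sound names, and `DemoMapper` class are all hypothetical.

```python
import numpy as np

class DemoMapper:
    """Illustrative interactive-ML sketch: train by demonstration,
    then classify new sensor input by nearest neighbour.
    (Hypothetical; not the project's actual software.)"""

    def __init__(self):
        self.examples = []   # recorded feature vectors
        self.labels = []     # sound name for each example

    def add_example(self, features, sound):
        # A practitioner demonstrates a gesture and names the sound it triggers.
        self.examples.append(np.asarray(features, dtype=float))
        self.labels.append(sound)

    def predict(self, features):
        # Map new sensor input to the sound of the closest demonstration.
        x = np.asarray(features, dtype=float)
        dists = [np.linalg.norm(x - e) for e in self.examples]
        return self.labels[int(np.argmin(dists))]

mapper = DemoMapper()
mapper.add_example([0.1, 0.2], "drum")   # e.g. tilt left -> drum
mapper.add_example([0.9, 0.8], "chime")  # e.g. tilt right -> chime
print(mapper.predict([0.85, 0.75]))      # -> chime
```

The appeal of this train-by-demonstration workflow is that a therapist can retune the instrument in seconds for a particular child's range of movement, without programming.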
The project was developed with Samuel Thompson Parke-Wolfe, Rebecca Fiebrink, and Simon Steptoe, in collaboration with the Department of Computing at Goldsmiths, University of London, in the context of the Sound Control action research project of NMPAT and the ENS Paris-Saclay Pre-doctoral Research program.