Sound Control is an action research project that investigated how interactive machine learning could support the practice of music therapists working with disabled children.
The Sound Control project took place within a community music centre's Musical Inclusion programme. Music therapists and educators were interested in more flexibly customising digital instruments for the disadvantaged children they worked with, including, but not limited to, children with physical and learning disabilities.
Our team led participatory design sessions with Musical Inclusion programme personnel and other music therapists and educators from the local community. In later workshops, we taught participants to use prototype technologies developed for the project, such as Grab-and-play, and elicited their feedback. Practitioners found customisation useful for helping children recognise and exercise agency in their environment, while supporting aims around movement, listening, and social interaction. The technologies continue to be used by therapists and teachers in their workshops, and have enabled children to take part in public music performances.
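For readers unfamiliar with interactive machine learning, the sketch below illustrates the general mapping-by-demonstration idea behind tools like Grab-and-play: a practitioner records a few paired sensor and sound-parameter examples, a model is trained on them, and live sensor input is then mapped to sound parameters in real time. This is a minimal Python illustration using a scikit-learn k-nearest-neighbours regressor; the sensor values, parameter names, and model choice are hypothetical and do not reflect the project's actual implementation.

```python
# Illustrative sketch only (not the project's code): a mapping-by-demonstration
# loop, where a few sensor-pose -> sound-parameter examples are recorded and a
# regressor interpolates between them.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical demonstrations: 3-axis accelerometer readings paired with
# desired synth parameters (pitch in Hz, volume in 0..1).
sensor_examples = np.array([
    [0.0, 0.0, 1.0],   # device held flat
    [1.0, 0.0, 0.0],   # tilted right
    [0.0, 1.0, 0.0],   # tilted forward
])
sound_examples = np.array([
    [220.0, 0.2],      # low pitch, quiet
    [440.0, 0.8],      # mid pitch, loud
    [880.0, 0.5],      # high pitch, medium
])

# Train on the handful of demonstrations; a small k keeps the mapping
# faithful to the recorded poses while still interpolating between them.
model = KNeighborsRegressor(n_neighbors=2).fit(sensor_examples, sound_examples)

# At performance time, each incoming sensor frame is mapped to synth
# parameters, which a real system would send on to a sound engine.
live_frame = np.array([[0.5, 0.5, 0.0]])
pitch_hz, volume = model.predict(live_frame)[0]
print(f"pitch={pitch_hz:.0f} Hz, volume={volume:.2f}")
```

Because the model is retrained in seconds from a handful of examples, a therapist can adapt an instrument's mapping to an individual child's range of movement during a session rather than in advance.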
Year
2016–2019
Links/Credits
Sound Control website
The project was developed with Samuel Thompson Parke-Wolfe, Rebecca Fiebrink, and Simon Steptoe, in collaboration with the Department of Computing at Goldsmiths, University of London, as part of the NMPAT action research project Sound Control and the pre-doctoral research programme of ENS Paris-Saclay.
Publications
Paper at NIME (2019)
Pre-doctoral report (2016)
Paper at ICMC (2016)