Sound Control is an action-research project that investigated how interactive machine learning could support the practice of music therapists working with disabled children.
The project took place within a community music centre programme called Musical Inclusion. Music therapists and educators were interested in customising digital instruments more flexibly for the disadvantaged children they worked with, including, but not limited to, children with physical and learning disabilities.
Our team led participatory design with music therapists and other educators from the local community. In later workshops, we taught participants to use prototype technologies developed for the project, such as Grab-and-play, and then elicited their feedback. Practitioners found customisation useful for helping children recognise and exercise agency in their environment, while supporting movement, listening, and social aims. The technologies continue to be used by therapists and teachers in their workshops, and have also enabled children to take part in public music performances.
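For readers unfamiliar with how interactive machine learning supports this kind of customisation, the sketch below is a minimal, purely illustrative example of a practitioner-trainable gesture-to-sound mapping, assuming accelerometer-style sensor input and a nearest-neighbour model. The class and parameter names are hypothetical and do not describe the actual Grab-and-play implementation.

```python
# Illustrative sketch only: a practitioner records a few (gesture, sound-setting)
# examples, and the mapping then generalises to new gestures via nearest neighbours.
# All names are hypothetical; this is not the project's Grab-and-play code.
from dataclasses import dataclass, field
import math

@dataclass
class GestureSoundMapper:
    """Maps a sensor reading (e.g. x/y/z tilt) to sound parameters (e.g. pitch, volume)."""
    examples: list = field(default_factory=list)  # list of (sensor_vector, sound_params)

    def add_example(self, sensor, sound):
        """Record one demonstration: 'when the sensor is held like this, make this sound'."""
        self.examples.append((tuple(sensor), tuple(sound)))

    def map(self, sensor, k=3):
        """Return sound parameters as a distance-weighted average of the k nearest demonstrations."""
        nearest = sorted(self.examples, key=lambda ex: math.dist(ex[0], sensor))[:k]
        weights = [1.0 / (math.dist(s, sensor) + 1e-6) for s, _ in nearest]
        total = sum(weights)
        n_params = len(nearest[0][1])
        return tuple(
            sum(w * params[i] for w, (_, params) in zip(weights, nearest)) / total
            for i in range(n_params)
        )

# Example: a therapist demonstrates two postures, and the mapping interpolates between them.
mapper = GestureSoundMapper()
mapper.add_example(sensor=(0.0, 0.0, 1.0), sound=(60, 0.2))   # resting    -> low pitch, quiet
mapper.add_example(sensor=(1.0, 0.0, 0.0), sound=(72, 0.8))   # arm raised -> high pitch, loud
print(mapper.map((0.5, 0.0, 0.5), k=2))                       # an in-between posture
```

The point of the sketch is the workflow rather than the model: a therapist can retrain the mapping in seconds with a handful of demonstrations, which is what makes per-child customisation practical in a workshop setting.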