Grab-and-play

Hugo Scurto, Rebecca Fiebrink, 2016
software, Java, supervised learning

A Wekinator extension for rapid prototyping of motion-sound mappings. A person demonstrates how they might move with their input device; supervised learning then generates a diverse set of sonic mappings matched to that person's motion range. Grab-and-play contributed to the Sound Control action research project, as well as to a music performance by yug.
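A minimal sketch of the core idea, in Java. All class and method names here are hypothetical (this is not the actual Wekinator extension API): the demonstrated motion is summarized by its observed value range, and several candidate linear mappings are generated so that the full motion range spans the full sound-parameter range.

```java
import java.util.Random;

// Toy illustration of grab-and-play-style mapping generation
// (hypothetical names, not the real extension's code): candidate
// mappings are fitted to the range the person actually covered
// while demonstrating, rather than to a fixed sensor range.
public class GrabAndPlaySketch {

    // One candidate mapping: scales an input in [min, max] to [0, 1],
    // ascending or descending depending on slopeSign.
    static class Mapping {
        final double min, max, slopeSign;
        Mapping(double min, double max, double slopeSign) {
            this.min = min; this.max = max; this.slopeSign = slopeSign;
        }
        double apply(double x) {
            double t = (x - min) / (max - min); // normalize to [0, 1]
            t = Math.max(0.0, Math.min(1.0, t)); // clamp outside the range
            return slopeSign > 0 ? t : 1.0 - t;
        }
    }

    // Generate several diverse mappings matched to the demonstrated range.
    static Mapping[] generateMappings(double[] demo, int count, long seed) {
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
        for (double x : demo) {
            min = Math.min(min, x);
            max = Math.max(max, x);
        }
        Random rng = new Random(seed);
        Mapping[] out = new Mapping[count];
        for (int i = 0; i < count; i++) {
            out[i] = new Mapping(min, max, rng.nextBoolean() ? 1 : -1);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] demo = {0.2, 0.5, 0.9}; // demonstrated sensor values
        Mapping[] maps = generateMappings(demo, 4, 42L);
        for (Mapping m : maps) {
            System.out.printf("0.50 -> %.2f%n", m.apply(0.5));
        }
    }
}
```

Each mapping could then drive a different synthesis parameter, giving the person several sonically distinct options to "grab" from a single demonstration.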

Designed in collaboration with a music therapist.
Year
2016
Credits
The project was developed with Rebecca Fiebrink in collaboration with the Department of Computing at Goldsmiths, University of London, in the context of the pre-doctoral research program of ENS Paris-Saclay.
Publications
Paper at ICMC (2016)
Pre-doctoral report (2016)
Events
Outreach @ BBC Radio 1 Academy, Exeter Phoenix, Exeter, UK (May 17, 2016)
Code
GitHub
