JamSketch Eye: An Eye-tracking-based Improvisation System for Disabled People by Tetsuro Kitahara, Yasuyuki Saito, Sergio Giraldo & Rafael Ramirez
Improvisation is an exciting way to enjoy music, but it is difficult for people with severe motor disabilities. In this paper, we propose a system that enables such users to improvise using only gaze control. Two issues arise in developing this system. The first is what kind of data the user should input. Because our target users can control only their gaze, the data they can input are limited. We adopt the melodic outline, a representation that captures only the macro-level contour of the melody the user intends; users can draw melodic outlines easily by gaze with a commercial eye tracker. The second issue is how to generate a concrete melody from a given melodic outline. For this, we use the genetic-algorithm-based melody generation method proposed in our previous paper. Through preliminary tests, we confirmed that users could improvise by drawing melodic outlines with their gaze.
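To make the pipeline concrete, here is a minimal sketch of how a gaze-drawn melodic outline might be turned into a melody with a genetic algorithm. This is not the authors' actual algorithm: the fitness function, scale, and all function names (`generate_melody`, `fitness`, etc.) are hypothetical simplifications, assuming the outline has been resampled to one pitch-space value per note.

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale, MIDI note numbers

def random_melody(length):
    # A candidate melody: one in-scale pitch per outline sample.
    return [random.choice(SCALE) for _ in range(length)]

def fitness(melody, outline):
    # Higher is better: penalize distance from the drawn contour.
    # (The real system's fitness is richer; this only rewards contour tracking.)
    return -sum(abs(p, ) if False else abs(p - o) for p, o in zip(melody, outline))

def mutate(melody, rate=0.2):
    # Randomly replace some notes with other in-scale pitches.
    return [random.choice(SCALE) if random.random() < rate else p for p in melody]

def crossover(a, b):
    # One-point crossover of two parent melodies.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def generate_melody(outline, pop_size=50, generations=100):
    pop = [random_melody(len(outline)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, outline), reverse=True)
        elite = pop[: pop_size // 2]                 # keep the better half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, outline))

# A hypothetical gaze-drawn outline, one value per eighth note in pitch space:
outline = [60, 62, 64, 67, 69, 67, 64, 62]
melody = generate_melody(outline)
```

Because the fitness here only measures contour distance, the population quickly converges toward the outline itself; the published method additionally shapes the result into an idiomatic melody.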
I develop music systems that let non-musicians enjoy music through performance and creation. This year, I presented a demo of an improvisation system controlled by eye tracking.