Immediately after my visit to the AlloSphere, Dr. John Thompson, Prof. JoAnn Kuchera-Morin, and their colleagues introduced me to a project called "Out of the Ether." Their approach uses various sensors for natural control of computer-generated musical material in live performances. The group showed me a video in which a flutist directed a computer-based system. The analysis system consists of components for computer vision, audio analysis, and sensor analysis. The computer vision component tracks the position of the flute, the position of the flutist's head, and the direction in which the flutist is looking (gaze tracking).
We discussed, among other things, how music and language resemble and differ from each other and how methods such as the self-organizing map could be used in the context of the project.
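To give a concrete sense of the self-organizing map we discussed: in a project like this, a SOM could arrange feature vectors (for instance, short audio or gesture descriptors) on a 2-D grid so that similar inputs land on nearby nodes. The sketch below is a minimal, generic SOM in NumPy; all function names, parameters, and defaults are illustrative assumptions of mine, not details of the project's actual system.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small 2-D self-organizing map on `data` (n_samples x n_features).

    Parameters and defaults are illustrative, not taken from the project.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    # Grid coordinates of every map node, used by the neighborhood function.
    coords = np.stack(
        np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1
    )
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Linearly decay learning rate and neighborhood radius.
            frac = step / n_steps
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 0.5
            # Best-matching unit: node whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), (rows, cols))
            # Gaussian neighborhood around the BMU on the grid.
            grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-grid_d2 / (2 * sigma ** 2))
            # Pull the weights of nearby nodes toward the input.
            weights += lr * h[..., None] * (x - weights)
            step += 1
    return weights

def map_to_node(weights, x):
    """Return the grid coordinates of the best-matching unit for input x."""
    dists = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)
```

After training, each incoming feature vector maps to a grid cell via `map_to_node`, and inputs from different regions of the feature space end up on different cells, which could then be tied to musical responses.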