“This paper presents methods for correlating a human performer and a synthetic accompaniment based on Implicit Relevance Feedback (IRF), using Graugaard’s expanded model for interactive music (Graugaard 2006c). The research results from practical work with interactive music systems developed in 2004–06 for a body of commissioned works, and is grounded in the human perception of music as an expressive art form in which musically significant data may be present not only in the audio signal but also in human gestures and in physiological data. The relevance and feasibility of including expression and emotion as high-level signal-processing means for bridging human and machine are discussed. The resulting model is multi-level (physical, sensorial, perceptual, formal, expressive) and multi-modal (sound, human gesture, physiological), which makes it applicable to purely musical contexts as well as to intermodal contexts where music is combined with visual and/or physiological data.”
In Proceedings of the Information Interaction in Context (IIiX) symposium, Copenhagen, Denmark, October 18–20, 2006. ISBN: 1-59593-482-0.