Implicit Relevance Feedback in Interactive Music: Issues, Challenges, and Case Studies



“This paper presents methods for correlating a human performer and a synthetic accompaniment based on Implicit Relevance Feedback (IRF) using Graugaard’s expanded model for interactive music (Graugaard 2006c). The research is the result of experience with practical work with interactive music systems developed 2004-06 for a body of commissioned works and is based on human perception of music as an expressive artform where musically significant data may be present not only in the audio signal but also in human gestures and in physiological data. The relevance and feasibility of including expression and emotion as a high-level signal processing means for bridging man and machine is discussed. The resulting model is multi-level (physical, sensorial, perceptual, formal, expressive) and multi-modal (sound, human gesture, physiological), which makes it applicable to purely musical contexts, as well as intermodal contexts where music is combined with visual and/or physiological data.”

In Proceedings of the Information Interaction in Context (IIiX) symposium. Copenhagen, Denmark, October 18-20 2006. ISBN: 1-59593-482-0.

My main areas of research are advanced music composition, real-time music technology, and music cognition. I am particularly interested in the relationship between core emotions and score-notation and performance features, as well as in novel methods for generative real-time performance grounded in non-expert music cognition.

Some years ago I was involved in the Nordic SUM project – Systematic Understanding of Music – and a large part of the project concerned implementing concepts from probabilistic melody generation in computer code for real-time performance.
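To give a flavour of what "probabilistic melody generation" can mean in practice, here is a minimal illustrative sketch of one common approach: a first-order Markov chain over MIDI pitches. The transition table and all names below are hypothetical examples for illustration only, not code or data from the SUM project itself.

```python
import random

# Hypothetical first-order Markov transition table over MIDI pitches.
# Keys are current pitches; values map next pitch -> probability.
TRANSITIONS = {
    60: {62: 0.5, 64: 0.3, 67: 0.2},  # from C4: step to D4 or E4, or leap to G4
    62: {60: 0.4, 64: 0.6},           # from D4
    64: {62: 0.5, 65: 0.3, 67: 0.2},  # from E4
    65: {64: 0.7, 67: 0.3},           # from F4
    67: {65: 0.4, 64: 0.3, 60: 0.3},  # from G4
}

def next_pitch(current, rng):
    """Sample the next pitch from the transition distribution of `current`."""
    row = TRANSITIONS[current]
    pitches = list(row)
    weights = [row[p] for p in pitches]
    return rng.choices(pitches, weights=weights, k=1)[0]

def generate_melody(start=60, length=8, seed=None):
    """Generate `length` MIDI pitches by walking the Markov chain from `start`."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        melody.append(next_pitch(melody[-1], rng))
    return melody

print(generate_melody(seed=1))
```

In a real-time setting, a loop like this would typically be driven by a scheduler or audio callback, emitting one note per clock tick rather than generating the whole sequence up front.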

This work has since found its way into several commercial releases and is today a constant presence in my live laptop performances, whether playing in duo or small-group formats with a variety of instrumentalists or performing solo.