ABSTRACT

Movement-based interactive musical systems have been developed in artistic communities since the beginning of electronic music. This chapter presents a general methodology for designing such systems, combining movement sensing with descriptor-based synthesis of recorded sound materials. Importantly, the design principles focus on action–sound metaphors that can be built upon features of recorded sound material and their possible relationships to human movement. These musical interactive systems can be seen as the fusion of several lines of research at Ircam, ranging from fundamental research on movement and interaction to technical developments in motion sensing and sound synthesis. The chapter explains several key conceptual elements that motivate our approach. It presents the general method and technical architecture, and describes how this synthesis method can be used to implement movement–sound relationships in concrete exemplary cases. The chapter ends with a discussion and an outline of future challenges.