ABSTRACT

This chapter describes a consistent set of temporal models developed over the years for analyzing movement in real-time musical interaction. These models are probabilistic and can be unified and generalized under the formalism of dynamic Bayesian networks (DBNs). This unified approach offers new perspectives for embodied music cognition and interaction, both for fundamental studies and for technological development. The chapter aims to determine, in real time, characteristics of physical movement that can serve two purposes: performance analysis and/or control parameters in interactive music systems. It proposes the general framework of DBNs, which allows modeling at various temporal scales and, through its probabilistic nature, captures the spatiotemporal variations and intrinsic variability of measured human movement.
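As a minimal illustration of the DBN formalism discussed above (a sketch, not the chapter's actual models): a hidden Markov model is the simplest dynamic Bayesian network, and its forward-filtering recursion shows how a hidden temporal state can be tracked in real time from variable, noisy movement observations. All state names, symbols, and probabilities below are illustrative assumptions.

```python
# Sketch: an HMM is the simplest dynamic Bayesian network. The forward
# recursion below filters a hidden "gesture phase" from a stream of
# discretized movement observations. All parameters are illustrative.

def forward_filter(pi, A, B, observations):
    """Return the normalized filtered state distribution after each observation.

    pi: initial state probabilities
    A:  state transition matrix, A[r][s] = P(state s | previous state r)
    B:  observation likelihoods, B[s][o] = P(observation o | state s)
    """
    n = len(pi)
    # Initialize with the first observation, then normalize.
    alpha = [pi[s] * B[s][observations[0]] for s in range(n)]
    norm = sum(alpha)
    alpha = [a / norm for a in alpha]
    alphas = [alpha]
    # Recursive predict-and-update step for each subsequent observation.
    for obs in observations[1:]:
        alpha = [
            B[s][obs] * sum(alpha[r] * A[r][s] for r in range(n))
            for s in range(n)
        ]
        norm = sum(alpha)
        alpha = [a / norm for a in alpha]
        alphas.append(alpha)
    return alphas

# Two hypothetical gesture phases ("preparation", "stroke"),
# three observation symbols (e.g. quantized velocity levels).
pi = [0.8, 0.2]
A = [[0.7, 0.3], [0.1, 0.9]]
B = [[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]]
filtered = forward_filter(pi, A, B, [0, 1, 2, 2])
print(filtered[-1])  # posterior over the two phases after the last observation
```

Because each filtering step uses only the previous distribution and the current observation, the same recursion runs incrementally on a live feature stream, which is what makes this family of models suited to real-time interaction.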