ABSTRACT

This chapter presents the idea that skilled motor control on a digital musical instrument (DMI) can be used to study complex auditory phenomena. It provides an overview of the design principles underlying the HandSketch and describes its hardware and software components as they have evolved to date. In 2005–2006, there was considerable enthusiasm around the design of a controller aimed at manipulating the voice quality dimensions of the RealtimeCALM synthesizer. The vocal sounds produced with the HandSketch are synthesized by custom software called RAMCESS. The chapter formalizes the pen-based interaction, essentially by solving ergonomic problems. It also presents the analysis-by-performance (AbP) methodology, which takes advantage of skilled behavior on new interfaces for musical expression (NIMEs). The chapter discusses some specific aspects of human-computer interaction (HCI) encountered in the development of DMIs. Finally, it illustrates AbP through the modeling of vibrato in singing.