
Gesture Control of Sound Synthesis: Analysis and Classification of Percussion Gestures

In recent years, the control of virtual instruments or sound-synthesis processes by natural gestures has become an important research field, both for building new audio-visual tools and for exploring gesture-sound relationships. Such multimodal and interactive tools typically offer two advantages: on the one hand, they provide realistic virtual instruments whose responses can be compared to those of existing musical instruments; on the other hand, they make it possible to vary the characteristics of natural gestures while preserving coherence between gesture and sound parameters.

In this paper, we present and evaluate a new framework for explicitly expressing the characteristics of natural percussion gestures, which are used for modeling, controlling, and ultimately synthesizing new percussion gestures. A preliminary analysis of pre-recorded gestures leads to the identification and evaluation of significant parameters using a classification approach. This analysis shows that a reduced-dimension representation of the captured motion can be used to control a virtual character. Furthermore, the simulated gestures provide dynamical variables that can be used to control sound synthesis through a mapping-interaction process.
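The reduced-dimension representation described above can be illustrated with a small sketch. The paper does not specify the exact dimensionality-reduction technique, so PCA (a common choice for motion-capture data) is assumed here; the data shapes and variable names are hypothetical:

```python
import numpy as np

def reduce_motion(frames, n_components=2):
    """Project motion-capture frames (T x D array: T time steps,
    D joint coordinates) onto their top principal components,
    yielding a low-dimensional control signal for a virtual
    character. PCA is an assumed stand-in for the paper's method."""
    mean = frames.mean(axis=0)
    centered = frames - mean
    # SVD of the centered data yields the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]          # (k x D) principal axes
    coords = centered @ basis.T        # (T x k) reduced representation
    return coords, basis, mean

# Toy example: 100 frames of a synthetic 30-DOF "gesture"
# generated from 2 underlying factors plus small noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 30))
frames = latent @ mixing + 0.01 * rng.normal(size=(100, 30))

coords, basis, mean = reduce_motion(frames, n_components=2)
recon = coords @ basis + mean
rel_err = np.linalg.norm(frames - recon) / np.linalg.norm(frames)
```

Because the toy data has only two true factors, two components reconstruct the trajectories almost exactly; the low-dimensional `coords` (or its derivatives) could then be mapped to sound-synthesis parameters in the mapping-interaction stage.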

Document Type: Research Article

Publication date: July 1, 2010
