
(2014) Sound, music, and motion, Dordrecht, Springer.
Extracting commands from gestures: gesture spotting and recognition for real-time music performance
Jiuqiang Tang, Roger B. Dannenberg
pp. 72-85
Our work allows an interactive music system to spot and recognize "command" gestures from musicians in real time. The system gives the musician gestural control over sound and the flexibility to make distinct changes during the performance by interpreting gestures as discrete commands. We combine a gesture threshold model with a Dynamic Time Warping (DTW) algorithm for gesture spotting and classification. The following problems are addressed: i) recognizing discrete commands embedded within continuous gestures, and ii) automatically selecting thresholds and features, using the F-measure to find good system parameters from training data.
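The two ingredients named in the abstract can be illustrated with a short sketch. The Python code below is not the authors' implementation; it is a minimal, hedged example assuming framewise motion features: a length-normalized DTW distance compares a candidate segment against each command template, a rejection threshold separates commands from non-command movement, and a simple sweep picks the threshold that maximizes the F-measure on labeled training segments. All function names, the normalization, and the error-counting convention are illustrative assumptions.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(n*m) DTW between two feature sequences (frames x features)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])    # local frame distance
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    # Length-normalize so one threshold is comparable across templates of
    # different lengths (an assumed convention, not taken from the paper).
    return cost[n, m] / (n + m)

def spot_command(segment, templates, threshold):
    """Return the best-matching command label, or None for a non-command gesture."""
    best_label, best_dist = None, np.inf
    for label, template in templates.items():
        d = dtw_distance(segment, template)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

def f_measure(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def select_threshold(segments, labels, templates, candidate_thresholds):
    """Pick the rejection threshold that maximizes F-measure on training data.

    `labels` uses None for segments that are not commands; a wrong label on a
    true command is counted here as a false positive (a simplification).
    """
    best_t, best_f = None, -1.0
    for t in candidate_thresholds:
        tp = fp = fn = 0
        for segment, true_label in zip(segments, labels):
            predicted = spot_command(segment, templates, t)
            if predicted is None:
                if true_label is not None:
                    fn += 1              # missed a real command
            elif predicted == true_label:
                tp += 1                  # correctly spotted command
            else:
                fp += 1                  # spurious or mislabeled detection
        f = f_measure(tp, fp, fn)
        if f > best_f:
            best_t, best_f = t, f
    return best_t, best_f
```

In a live setting the candidate segment would typically be a sliding window of recent feature frames, re-evaluated as new frames arrive; the length normalization is one way to keep a single rejection threshold meaningful across command templates of different durations.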
Publication details
DOI: 10.1007/978-3-319-12976-1_5
Full citation:
Tang, J., Dannenberg, R. B. (2014). Extracting commands from gestures: gesture spotting and recognition for real-time music performance, in M. Aramaki, O. Derrien, R. Kronland-Martinet & S. Ystad (eds.), Sound, music, and motion, Dordrecht, Springer, pp. 72-85.