Summary:
The motivation for this paper is the idea that human gestures can be transferred to an electronic instrument to produce an emotional musical performance without the performer having to learn the technique for controlling a specific instrument. Also crucial to the work, which uses three-dimensional acceleration sensors to capture gesture motion, is the assumption that emotion is portrayed better by the force applied to an object than by the position of the hand on it. Because the acceleration sensor data has poor quantitative reproducibility, global features of the motions are extracted instead of relying on simple pattern matching. In the study, the magnitude of the acceleration change in eight principal directions, the rotational direction, the intensity of motion, and the motion in the z direction are used as characteristics for gesture recognition.
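The summary lists these features but not the formulas behind them. A minimal sketch of how such global features could be extracted from a 3-axis acceleration trace is shown below in Python; the binning of change vectors into eight principal directions in the x-y plane, the cross-product test for rotational direction, and the use of the mean change magnitude as intensity are assumptions for illustration, not details taken from the paper.

import numpy as np

def extract_global_features(accel, n_directions=8):
    """Sketch: global features from an (N, 3) array of x, y, z acceleration."""
    delta = np.diff(accel, axis=0)                 # frame-to-frame change
    dx, dy, dz = delta[:, 0], delta[:, 1], delta[:, 2]

    # Accumulate the magnitude of each change vector into one of eight
    # principal directions in the x-y plane.
    angles = np.arctan2(dy, dx)                    # range -pi..pi
    bins = ((angles + np.pi) / (2 * np.pi) * n_directions).astype(int) % n_directions
    direction_energy = np.bincount(bins, weights=np.hypot(dx, dy),
                                   minlength=n_directions)

    # Rotational direction: sign of the summed cross products of successive
    # x-y change vectors (positive = counter-clockwise).
    cross = dx[:-1] * dy[1:] - dy[:-1] * dx[1:]
    rotation = np.sign(cross.sum())

    # Overall intensity of motion and net motion in the z direction.
    intensity = np.linalg.norm(delta, axis=1).mean()
    z_motion = dz.sum()

    return np.concatenate([direction_energy, [rotation, intensity, z_motion]])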
Before recognition takes place, a user must train the system, which builds up a standard pattern of data so that thresholds can be set on an individual level. Gesture segmentation is triggered by the intensity of acceleration. Averages and standard deviations of the acceleration data are calculated and used as the basis for comparison during gesture recognition. Ten types of gestures are recognized and used to control the performance of MIDI music. The user whose gesture data trained the system performed gestures with a 100% recognition rate, while another user's attempts at particular gestures were misclassified.
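The matching rule itself is not spelled out above, so the sketch below assumes a nearest-template scheme: training stores the mean and standard deviation of each gesture's feature vector, recognition picks the template with the smallest z-score distance and rejects it if that distance exceeds a per-user threshold, and segmentation starts and ends a gesture when the acceleration magnitude crosses illustrative intensity thresholds.

import numpy as np

class GestureRecognizer:
    """Sketch of per-user, statistics-based gesture recognition."""

    def __init__(self, threshold=3.0):
        self.templates = {}          # gesture label -> (mean, std) of features
        self.threshold = threshold   # rejection threshold, tuned per user

    def train(self, label, feature_vectors):
        f = np.asarray(feature_vectors, dtype=float)
        self.templates[label] = (f.mean(axis=0), f.std(axis=0) + 1e-6)

    def recognize(self, features):
        best_label, best_score = None, np.inf
        for label, (mean, std) in self.templates.items():
            score = np.abs((features - mean) / std).mean()   # z-score distance
            if score < best_score:
                best_label, best_score = label, score
        return best_label if best_score < self.threshold else None

def segment_gestures(accel, start_thresh=1.5, end_thresh=0.5):
    """Sketch of intensity-triggered segmentation on acceleration magnitude."""
    magnitude = np.linalg.norm(np.asarray(accel, dtype=float), axis=1)
    segments, start = [], None
    for i, m in enumerate(magnitude):
        if start is None and m > start_thresh:
            start = i                    # gesture begins on an intensity spike
        elif start is not None and m < end_thresh:
            segments.append((start, i))  # gesture ends when motion dies down
            start = None
    return segments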
A tempo prediction model based on the previous two tempos allows for a smoother performance of the music than could be achieved if the system simply waited to recognize each human-determined tempo.
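The exact prediction formula is not described above, so this sketch assumes simple linear extrapolation from the two most recent tempo estimates, which is one plausible reading of "based on the previous two tempos".

def predict_next_tempo(prev_tempo, prev_prev_tempo):
    """Extrapolate the next tempo (BPM) from the two most recent ones."""
    return prev_tempo + (prev_tempo - prev_prev_tempo)

# Example: beats at 118 then 120 BPM predict 122 BPM, so the system can
# schedule the next MIDI events without waiting to detect the beat itself.
print(predict_next_tempo(120.0, 118.0))  # -> 122.0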
Discussion:
Tempo recognition based on the change in acceleration magnitude is more accurate than image-based methods. This reduction of phase delay in detecting the tempo matters when the control problem is specifically about timing, as it is in music control. Some of the gestures used in the study did not seem naturally related to musical performance, such as the star, triangle, and heart-shaped motions. The paper also did not describe how the gestures were mapped to effects on the music, apart from the downward motion being mapped to tempo.