Thursday, May 8, 2008

Real-Time Locomotion Control by Sensing Gloves (Komura 2006)

Summary:
The paper focuses on a method of mapping the hand motion of a user to locomotion of virtual characters in real time. The proposed method allows control of a multi-joint character through the motion of fingers whose joint angles are detected by a sensing glove. Animations are recorded via motion capture or generated using 3D modeling software and played back to the user. The user mimics the animation by moving his or her hands, and this motion is used to generate a function that maps the hand motion to the standard animation. New locomotion animations can then be generated in real time when the user's hand moves in ways not necessarily within the domain of the motion used to generate the mapping function.
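As a minimal sketch of the calibration idea, consider fitting a linear least-squares map from recorded finger joint angles to character joint angles. The linear form, the function names, and the array layout are all my assumptions for illustration; the paper builds its mapping per-joint from the cyclic analysis described below, not from a single regression.

```python
import numpy as np

def fit_mapping(finger_angles, character_angles):
    """Fit a linear map from finger joint angles to character joint
    angles over the recorded calibration frames (least squares).
    finger_angles:    (frames, n_finger_dofs)
    character_angles: (frames, n_character_dofs)
    Returns W such that finger_frame @ W approximates the character pose."""
    W, *_ = np.linalg.lstsq(finger_angles, character_angles, rcond=None)
    return W

def drive_character(finger_frame, W):
    """Map one frame of live finger motion to a character pose."""
    return finger_frame @ W
```

Once `W` is fitted from the calibration cycle, `drive_character` can be called every frame on live glove data, which is what makes the control real-time.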
When calculating the correspondence between finger and animation motion, cyclic locomotion is assumed during the calibration cycle. The period of each degree of freedom is found from the auto-correlation of that degree of freedom's trajectory. During play, the period of each joint in the animation is compared to the period of finger motion, and the joints are classified as full cycle, half cycle, or exceptional. The hand's generalized coordinates are matched with the character's. Relative velocities of the character's end effectors with respect to its root are compared with relative velocities of fingertips with respect to the wrist to match each end effector with a finger. The motion of a joint is determined by the motion of the finger(s) associated with the end effector(s) that are its descendants.
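The period-detection step can be sketched as follows, assuming a uniformly sampled 1-D joint-angle trajectory; the function name and the peak-picking heuristic (skip past the zero-lag peak, then take the dominant autocorrelation maximum) are mine, not the paper's.

```python
import numpy as np

def estimate_period(signal):
    """Estimate the period (in samples) of a roughly cyclic 1-D signal
    from the dominant non-zero lag of its autocorrelation."""
    x = signal - np.mean(signal)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    # Skip the zero-lag peak: search only after the autocorrelation
    # first dips below zero, then take the next global maximum.
    below = np.where(ac < 0)[0]
    if len(below) == 0:
        return None  # no clear cycle found
    start = below[0]
    return int(start + np.argmax(ac[start:]))
```

With a period estimated per degree of freedom this way, comparing a joint's period to the finger's period is what allows the full-cycle / half-cycle / exceptional classification the paper describes.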
To test the method, mapping functions were generated for both human and dog walking animations. The animations were successfully extrapolated into running animations in real time. An experiment involving four subjects was conducted to compare the speed and number of collisions of a game character being navigated through a maze. On average, keyboard controlled locomotion took 9% less time, but resulted in more than twice as many collisions as sensing glove controlled locomotion.

Discussion:
An extension that would improve the realism of the motion would be animation blending. Currently, the system uses a single animation played at different speeds controlled by finger speed. However, a walking animation played back at double the intended rate does not look the same as a running animation. Animation blending computes a weighted combination of the joint angles of two or more animations, so that the resulting frames transition naturally between animations, with the weights depending on speed.
During the control stage, when finger motion exceeds the original bounds of the mapping function generated in the calibration stage, extrapolation of the mapping function is performed for a limited space outside of the original domain. Instead of only considering the tangent (just the first derivative) to the mapping function at its endpoint, more derivatives could be considered to obtain a more accurate estimate of motion outside the original domain.
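Using more derivatives at the endpoint amounts to a truncated Taylor expansion about that point. A sketch, where the interface is an illustrative assumption (passing only the first derivative reproduces the tangent extrapolation the paper uses):

```python
import math

def taylor_extrapolate(f_end, derivs, dx):
    """Extrapolate a mapping function a small step dx past its last
    calibrated point using a truncated Taylor series.
    f_end:  function value at the endpoint
    derivs: [f', f'', ...] evaluated at the endpoint
    dx:     step beyond the endpoint (should stay small)"""
    value = f_end
    for k, d in enumerate(derivs, start=1):
        value += d * dx**k / math.factorial(k)
    return value
```

For example, for f(x) = x^2 ending at x = 2 (so f = 4, f' = 4, f'' = 2), a second-order step of dx = 0.5 recovers the exact value 6.25, whereas the tangent alone gives 6.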
Controlling the jumping motion of a character with the hand seems like a strange gameplay choice. If hand position were mapped to character height, then holding the hand in the air would allow the player to hover, which is not realistic. If jumping were instead simulated in a physically realistic way, then mapping hand position to character height becomes pointless, since all hand-height information would be discarded except for the start time of the jump.
