
The 3D human motion control through refined video gesture annotation

Yohan Jin, Myunghoon Suk, B. Prabhakaran

pp. 551-564

In the early days of the computer and video game industry, simple game controllers consisting of buttons and joysticks were employed, but recently game consoles have been replacing joystick buttons with novel interfaces such as remote controllers with motion-sensing technology on the Nintendo Wii [1]. In particular, video-based human-computer interaction (HCI) techniques have been applied to games; a representative example is "EyeToy" on the Sony PlayStation 2. Video-based HCI has the great benefit of freeing players from the intractable game controller. Moreover, video-based HCI is very important for communication between humans and computers, since it is intuitive, easy to learn, and inexpensive. On the other hand, extracting semantic low-level features from video of human motion is still a major challenge: the achievable accuracy depends heavily on each subject's characteristics and on environmental noise. Of late, people have been using 3D motion-capture data for visualizing real human motion in 3D space (e.g., "Tiger Woods" in EA Sports games, "Angelina Jolie" in the movie Beowulf) and for analyzing motion in specific performances (e.g., a golf swing or walking). A 3D motion-capture system ("VICON") generates a matrix for each motion clip, in which each column corresponds to a human sub-body part and each row represents a time frame of the data capture. Thus, we can extract a sub-body part's motion simply by selecting the corresponding columns. Unlike the low-level feature values of video human motion, the entries of a 3D motion-capture data matrix are not pixel values; they are much closer to human-level semantics.
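The column-selection idea above can be sketched as follows. This is a minimal illustration, not the chapter's actual data layout: the number of channels, the part names, and the column assignments are all hypothetical stand-ins for a real VICON channel map.

```python
import numpy as np

# Hypothetical motion-capture clip: rows are time frames, columns are
# degrees of freedom of sub-body parts. The 9-channel layout below is
# illustrative only, not the actual VICON channel order.
n_frames = 120
clip = np.random.default_rng(0).normal(size=(n_frames, 9))

# Assumed mapping from sub-body part to its columns in the matrix.
columns = {
    "left_arm": [0, 1, 2],
    "right_arm": [3, 4, 5],
    "torso": [6, 7, 8],
}

def sub_body_motion(clip, part):
    """Extract one sub-body part's motion by selecting its columns."""
    return clip[:, columns[part]]

right_arm = sub_body_motion(clip, "right_arm")
print(right_arm.shape)  # (120, 3): every frame, right-arm columns only
```

Because each sub-body part occupies a fixed set of columns, a single slice yields that part's full trajectory over time, which is what makes the motion-capture representation easy to work with compared to raw video pixels.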

Publication details

DOI: 10.1007/978-0-387-89024-1_24

Full citation:

Jin, Y., Suk, M., Prabhakaran, B. (2009). The 3D human motion control through refined video gesture annotation, in B. Furht (ed.), Handbook of Multimedia for Digital Entertainment and Arts, Dordrecht, Springer, pp. 551-564.
