Center for Research in Computer Vision

Matching Trajectories of Anatomical Landmarks Under Viewpoint, Anthropometric, and Temporal Transforms

An approach is presented to match imaged trajectories of anatomical landmarks (e.g., hands, shoulders, and feet) using semantic correspondences between human bodies. These correspondences provide geometric constraints for matching actions observed from different viewpoints and performed at different rates by actors of differing anthropometric proportions. The fact that the human body has approximately constant anthropometric proportions allows innovative use of the machinery of epipolar geometry to constrain the analysis of actions performed by people of different sizes, while ensuring that changes in viewpoint do not affect matching. In addition, for linear time warps, a novel measure, constructed only from image measurements of the locations of anatomical landmarks across time, is proposed to ensure that similar actions performed at different rates are accurately matched as well. An additional feature of this measure is that two actions recorded by cameras moving at constant (and possibly different) velocities can also be matched. Finally, we describe how dynamic time warping can be used in conjunction with the proposed measure to match actions in the presence of nonlinear time warps. We demonstrate the versatility of our algorithm on a number of challenging sequences and applications, and report a quantitative evaluation of the proposed matching approach.
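To illustrate the final step, the following is a minimal sketch of dynamic time warping applied to two landmark trajectories sampled at different rates. The per-frame cost here is plain Euclidean distance between image points, used only as a stand-in for the paper's proposed measure; the function name and example data are illustrative, not from the paper.

```python
def dtw_distance(traj_a, traj_b):
    """Align two trajectories (lists of (x, y) image points) and return
    the cumulative alignment cost under a Euclidean per-frame cost."""
    n, m = len(traj_a), len(traj_b)
    INF = float("inf")
    # cost[i][j] = best cumulative cost aligning traj_a[:i] with traj_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            ax, ay = traj_a[i - 1]
            bx, by = traj_b[j - 1]
            d = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # stretch traj_b
                                 cost[i][j - 1],      # stretch traj_a
                                 cost[i - 1][j - 1])  # one-to-one match
    return cost[n][m]

# The same path traversed at two different speeds: the nonlinear
# time warp is absorbed by the alignment, so the cost is zero.
slow = [(0, 0), (0, 0), (1, 1), (1, 1), (2, 2)]
fast = [(0, 0), (1, 1), (2, 2)]
print(dtw_distance(slow, fast))  # → 0.0
```

In the paper's setting, the Euclidean cost would be replaced by the proposed viewpoint- and anthropometry-invariant measure, so that the alignment compares actions rather than raw image coordinates.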

Related Publications

Alexei Gritai, Yaser Sheikh, and Mubarak Shah, On the Invariant Analysis of Human Actions, 17th International Conference on Pattern Recognition, 2004.

Alexei Gritai, Yaser Sheikh, Cen Rao, and Mubarak Shah, Matching Trajectories of Anatomical Landmarks under Viewpoint, Anthropometric, and Temporal Transforms, International Journal of Computer Vision (IJCV), 2009.
