
Released

Conference Paper

Marker-less 3D Feature Tracking for Mesh-based Motion Capture

MPS-Authors
/persons/resource/persons43977
de Aguiar, Edilson
Computer Graphics, MPI for Informatics, Max Planck Society;

/persons/resource/persons45610
Theobalt, Christian
Computer Graphics, MPI for Informatics, Max Planck Society;

/persons/resource/persons45557
Stoll, Carsten
Computer Graphics, MPI for Informatics, Max Planck Society;

/persons/resource/persons45449
Seidel, Hans-Peter
Computer Graphics, MPI for Informatics, Max Planck Society;

Citation

de Aguiar, E., Theobalt, C., Stoll, C., & Seidel, H.-P. (2007). Marker-less 3D Feature Tracking for Mesh-based Motion Capture. In A. Elgammal, B. Rosenhahn, & R. Klette (Eds.), Human Motion – Understanding, Modeling, Capture and Animation (pp. 1-15). Berlin: Springer. doi:10.1007/978-3-540-75703-0_1.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-1FC3-0
Abstract
We present a novel algorithm that robustly tracks 3D trajectories of features on a moving human who has been recorded with multiple video cameras. Our method does so without special markers in the scene and can be used to track subjects wearing everyday apparel. By using the paths of the 3D points as constraints in a fast mesh deformation approach, we can directly animate a static human body scan such that it performs the same motion as the captured subject. Our method can therefore be used to directly animate high-quality geometry models from unaltered video data, which opens the door to new applications in motion capture, 3D Video, and computer animation. Since our method does not require a kinematic skeleton and only employs a handful of feature trajectories to generate lifelike animations with realistic surface deformations, it can also be used to track subjects wearing wide apparel, and even animals. We demonstrate the performance of our approach using several captured real-world sequences, and also validate its accuracy.
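
Illustration (not from the paper): the abstract describes using a handful of tracked 3D feature trajectories as positional constraints in a fast mesh deformation that drives a static body scan. The minimal sketch below shows the general idea with a uniform graph Laplacian and soft positional constraints solved in a least-squares sense; the function names, the toy mesh, and the uniform (rather than cotangent or more elaborate) Laplacian are assumptions for illustration and do not reproduce the authors' deformation scheme.

# Minimal sketch: tracked 3D feature positions act as soft positional
# constraints; the remaining vertices follow by preserving Laplacian
# (differential) coordinates of the rest pose. Illustrative only.
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix, vstack
from scipy.sparse.linalg import lsqr

def uniform_laplacian(n_verts, edges):
    """Uniform graph Laplacian L = I - D^-1 A built from mesh edges."""
    neighbors = [[] for _ in range(n_verts)]
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)
    L = lil_matrix((n_verts, n_verts))
    for i, nbrs in enumerate(neighbors):
        L[i, i] = 1.0
        for j in nbrs:
            L[i, j] = -1.0 / len(nbrs)
    return csr_matrix(L)

def deform(verts, edges, handle_ids, handle_targets, w=10.0):
    """Pull handle vertices toward target positions (e.g. tracked 3D
    features) while preserving the rest pose's differential coordinates."""
    n = len(verts)
    L = uniform_laplacian(n, edges)
    delta = L @ verts                        # differential coords of rest pose
    C = lil_matrix((len(handle_ids), n))     # soft positional constraints
    for row, vid in enumerate(handle_ids):
        C[row, vid] = w
    A = vstack([L, csr_matrix(C)])
    new_verts = np.zeros_like(verts)
    for dim in range(3):                     # solve each coordinate separately
        b = np.concatenate([delta[:, dim], w * handle_targets[:, dim]])
        new_verts[:, dim] = lsqr(A, b)[0]
    return new_verts

if __name__ == "__main__":
    # Toy "mesh": a 4-vertex strip; vertex 3 is dragged by one tracked feature.
    verts = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]])
    edges = [(0, 1), (1, 2), (2, 3)]
    tracked = np.array([[3.0, 1.0, 0.0]])    # hypothetical tracked 3D position
    print(deform(verts, edges, [3], tracked))

In the paper's setting the constraint positions come from marker-less multi-view feature tracking over time rather than a hard-coded target, and the deformation method is the authors' own fast scheme; the sketch only demonstrates how a few positional constraints can propagate smoothly over a surface.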