Article (Scientific journals)
UMONS-TAICHI: A multimodal motion capture dataset of expertise in Taijiquan gestures
Tits, Mickaël; Laraba, Sohaib; Caulier, Eric et al.
2018, in Data in Brief, 19, p. 1214-1221
Peer reviewed
 

Files


Full Text
1-s2.0-S2352340918305948-main.pdf
Publisher postprint (1.81 MB)

All documents in ORBi UMONS are protected by a user license.

Details



Abstract :
[en] In this article, we present a large 3D motion capture dataset of Taijiquan martial art gestures (n = 2200 samples) covering 13 classes (corresponding to Taijiquan techniques) executed by 12 participants of various skill levels. Participants' levels were ranked by three experts on a scale of [0-10]. The dataset was captured using two motion capture systems simultaneously: 1) Qualisys, a sophisticated optical motion capture system of 11 cameras that tracks 68 retroreflective markers at 179 Hz, and 2) Microsoft Kinect V2, a low-cost markerless time-of-flight depth sensor that tracks 25 locations of a person's skeleton at 30 Hz. Data from both systems were synchronized manually. Qualisys data were manually corrected and then processed to fill in missing data. Data were also manually annotated for segmentation; both segmented and unsegmented data are provided in this dataset. This article details the recording protocol as well as the processing and annotation procedures. The data were initially recorded for gesture recognition and skill evaluation, but they are also suited to research on motion synthesis, segmentation, multi-sensor data comparison and fusion, sports science, and human motion research more broadly. A preliminary analysis was conducted by Tits et al. (2017) [1] on part of the dataset to extract morphology-independent motion features for skill evaluation. Results of this analysis are presented in their communication 'Morphology Independent Feature Engineering in Motion Capture Database for Gesture Evaluation' (10.1145/3077981.3078037) [1]. Data are available for research purposes (license CC BY-NC-SA 4.0) at https://github.com/numediart/UMONS-TAICHI.
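Because the two streams run at different rates (Qualisys at 179 Hz, Kinect V2 at 30 Hz) and were synchronized manually, joint analysis of both modalities typically requires resampling one stream to the other's rate. Below is a minimal sketch of such an alignment step; the array layout `(n_frames, n_joints, 3)` and the loading of the data into NumPy arrays are assumptions for illustration, not part of the dataset's documented tooling:

```python
import numpy as np

def resample_skeleton(frames, src_hz, dst_hz):
    """Linearly resample a skeleton track of shape (n_frames, n_joints, 3)
    from src_hz to dst_hz (e.g. Kinect 30 Hz -> Qualisys 179 Hz).

    Assumes frames are evenly spaced in time with no dropped frames."""
    n = frames.shape[0]
    t_src = np.arange(n) / src_hz                  # source timestamps (s)
    t_dst = np.arange(0.0, t_src[-1], 1.0 / dst_hz)  # target timestamps (s)
    # Interpolate each joint coordinate independently over time.
    flat = frames.reshape(n, -1)
    res = np.stack(
        [np.interp(t_dst, t_src, flat[:, k]) for k in range(flat.shape[1])],
        axis=1,
    )
    return res.reshape((len(t_dst),) + frames.shape[1:])

# Example: 2 s of hypothetical Kinect data (25 joints, 3D) at 30 Hz,
# upsampled to the Qualisys rate of 179 Hz.
kinect = np.zeros((60, 25, 3))
aligned = resample_skeleton(kinect, 30, 179)
```

Linear interpolation is only one option; for smoother derived features (velocities, accelerations) a spline or filtering-based resampler may be preferable.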
Disciplines :
Electrical & electronics engineering
Author, co-author :
Tits, Mickaël ;  Université de Mons > Faculté Polytechnique > Information, Signal et Intelligence artificielle
Laraba, Sohaib ;  Université de Mons > Faculté Polytechnique > Service Information, Signal et Intelligence artificielle
Caulier, Eric
Tilmanne, Joëlle ;  Université de Mons > Faculté Polytechnique > Service Information, Signal et Intelligence artificielle
Dutoit, Thierry ;  Université de Mons > Faculté Polytechnique > Service Information, Signal et Intelligence artificielle
Language :
English
Title :
UMONS-TAICHI: A multimodal motion capture dataset of expertise in Taijiquan gestures
Publication date :
01 August 2018
Journal title :
Data in Brief
Publisher :
Elsevier, New York, United States
Volume :
19
Pages :
1214-1221
Peer reviewed :
Peer reviewed
Research unit :
F105 - Information, Signal et Intelligence artificielle
Research institute :
R300 - Institut de Recherche en Technologies de l'Information et Sciences de l'Informatique
R450 - Institut NUMEDIART pour les Technologies des Arts Numériques
Available on ORBi UMONS :
since 05 October 2018

