On this site you'll find:
For more information, please contact fronchetti (at) lidi.unlp.edu.ar
How to cite:
(soon)
Since our model needs to be tested with different datasets, and each has its own format, we provide scripts to convert from the original formats to our own unified format.
In all cases, the result is a single .mat MATLAB/Octave file that contains all the information about the actions. The file can be opened in MATLAB/Octave with the load function.
The .mat file contains:
A struct array, gestures. Each element of gestures is a struct with the information of a separate, segmented action instance.
A gesture struct contains 5 fields:
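As a quick sanity check, the unified file can be loaded and inspected as follows. This is a minimal sketch: the filename dataset.mat is a placeholder for whatever .mat file the conversion scripts produce.

```matlab
% Placeholder filename; use the .mat file produced by the conversion scripts.
data = load('dataset.mat');
gestures = data.gestures;

% Number of segmented action instances in the file.
fprintf('Number of action instances: %d\n', numel(gestures));

% List the fields stored for each action instance.
disp(fieldnames(gestures(1)));
```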
Download the scripts to convert the MSRC12, MSR Action3D, and Berkeley MHAD datasets to our own format.
Additionally, the script animate_dataset.m can be used to view the gestures of the MSRC12 and Action3D datasets.
MSRC12 is an action dataset with 12 gesture classes recorded by 30 subjects. More information about it, along with a download link for a zip file with the action data, can be found on the dataset's homepage. The zip contains several files, one for each action sequence. To convert the dataset to our format:
Download the dataset, and extract the data directory somewhere.
Download and extract the matlab code.
Download and extract the data folder of Hussein et al.'s annotation, which contains the start/end metadata used to segment the action sequences.
Use the sample script generate_dataset_from_msrc12.m to call the function msrc12_load_files with the path to the data directory of the MSRC12 dataset (the one with the .csv and .tagstream files) and the path to the folder containing Hussein et al.'s extracted annotation (the one named sepinst, with the .sep files), generating the .mat file in our unified format.
We also provide a list of the action instances we excluded from MSRC12.
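The steps above can be sketched as follows. The paths are placeholders for your local copies, and the exact call mirrors what generate_dataset_from_msrc12.m does; check the sample script for the precise argument order and output handling.

```matlab
% Placeholder paths; point them at your local copies.
msrc12_path     = 'path/to/msrc12/data';  % folder with the .csv and .tagstream files
annotation_path = 'path/to/sepinst';      % Hussein et al.'s annotation (.sep files)

% msrc12_load_files is provided in our matlab code; it reads the raw
% sequences, segments them with the annotation, and produces the unified
% .mat file (see generate_dataset_from_msrc12.m for the exact usage).
msrc12_load_files(msrc12_path, annotation_path);
```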
MSR Action3D is an action dataset with 20 gesture classes recorded by 10 subjects. More information about it (and others!), along with a download link for a zip file with the action data, can be found on the dataset's homepage. Be careful: the website contains information about many datasets. We used Ferda Ofli's version of the dataset, and that's the one you should download. The zip contains several files, one for each action sequence. To convert the dataset to our format:
Download the dataset, and extract to any directory.
Download and extract Jiang's Experiment File List, a commonly used whitelist.
Download and extract the matlab code.
Use the sample script generate_dataset_from_action3d.m to call the function action3d_read_and_convert with the path to the directory of the original dataset and the path to Jiang's Experiment File List, generating the .mat file in our unified format.
Note that this process also reorders the joints of the actions so that they match the order in the MSRC12 dataset (the default for Kinect devices).
We also provide a list of the action instances we excluded and corrected.
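Analogously to the MSRC12 case, the Action3D conversion can be sketched as follows. The paths are placeholders, and the call mirrors what generate_dataset_from_action3d.m does; consult the sample script for the precise argument order and output handling.

```matlab
% Placeholder paths; point them at your local copies.
action3d_path  = 'path/to/MSRAction3D';  % extracted original dataset (Ofli's version)
file_list_path = 'path/to/jiang_list';   % Jiang's Experiment File List

% action3d_read_and_convert is provided in our matlab code; it reads the
% whitelisted sequences, reorders the joints to match MSRC12, and produces
% the unified .mat file (see generate_dataset_from_action3d.m).
action3d_read_and_convert(action3d_path, file_list_path);
```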
(Soon)