Gesture Recognition plays a prominent role in smart environments and home automation. Thanks to the availability of Machine Learning approaches, users can define custom gestures to be associated with commands for the smart environment.
In this project we propose a Random Forest-based approach for Gesture Recognition of hand movements starting from wireless wearable motion capture data. We evaluate different feature extraction procedures to handle gestures and recordings of different durations.
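As a rough illustration of this idea, the MATLAB sketch below summarizes each variable-duration recording with a fixed-length vector of per-channel statistics and trains a Random Forest (MATLAB's `TreeBagger`) on it. The column names `Signal` and `Label`, the choice of statistics, and the number of trees are assumptions made for illustration only; they are not the project's actual pipeline (see `Gesture_Classification.m` and `gesturePreprocessing.m` for that).

```matlab
% Illustrative sketch only: column names (Signal, Label) and statistics are assumptions.
data = load('GR_Dataset.mat');
fn   = fieldnames(data);
T    = data.(fn{1});                 % assuming the .mat file stores a single MATLAB table

nObs  = height(T);
feats = [];
for i = 1:nObs
    sig = T.Signal{i};                                   % [samples x channels], duration varies
    f   = [mean(sig, 1), std(sig, 0, 1), range(sig, 1)]; % fixed-length per-channel statistics
    feats = [feats; f];                                  %#ok<AGROW>
end

% Random Forest: an ensemble of bagged classification trees on the fixed-length features
Mdl = TreeBagger(100, feats, T.Label, 'Method', 'classification');
```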
.
├── src/ : Contains report images
├── GR_Dataset.mat : Gesture dataset as MATLAB table
├── Gesture_Classification.m : Main routine
├── gesturePreprocessing.m : Preprocessing function for feature extraction
├── custom_MCCV.m : Custom function for Stratified Monte Carlo Cross Validation
├── Mdl_best_<PREPROCESSING_TYPE>.mat : Models pretrained on <PREPROCESSING_TYPE> features and ready to use (see the usage sketch after this tree)
└── README.md : Project Report
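A minimal usage sketch for the pretrained models is given below. The variable names, the exact contents of the `.mat` files, and the signature of `gesturePreprocessing` are assumptions and may differ from the actual code; `newGesture` is a hypothetical new recording.

```matlab
% Hypothetical usage sketch: adapt the names to the actual files and functions.
modelFile = 'Mdl_best_<PREPROCESSING_TYPE>.mat';   % substitute one of the provided preprocessing types
S   = load(modelFile);
fn  = fieldnames(S);
Mdl = S.(fn{1});                                   % assumed to contain a single trained classifier

X = gesturePreprocessing(newGesture);              % assumed: extract features from a new recording
predictedLabel = predict(Mdl, X);                  % classify the gesture with the pretrained model
```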
The number of individual executions for each gesture category-mode pair is reported in the table below.
The dataset, complete with the raw Motion Capture data, can also be retrieved in a more Python-oriented format (.pickle) from the GitLab page of the Department of Information Engineering of the University of Padova.
The detailed results of this project have been presented at the 5th IFAC Conference on Intelligent Control and Automation Sciences (ICONS 2019) in Belfast, United Kingdom, and the complete report has been published by Elsevier. 📝
I wish to thank my project supervisors and co-authors A. Cenedese, G.A. Susto, M. Carletti, and M. Terzi for their support and advice.