# A Tutorial on Human Activity Recognition Using Body-worn Inertial Sensors

Repository: https://github.com/andreas-bulling/ActRecTut (98.7% MATLAB)

MATLAB toolbox for the publication *A Tutorial on Human Activity Recognition Using Body-worn Inertial Sensors*:

@article{bulling14_csur,
title = {A Tutorial on Human Activity Recognition Using Body-worn Inertial Sensors},
author = {Andreas Bulling and Ulf Blanke and Bernt Schiele},
url = {https://github.com/andyknownasabu/ActRecTut},
issn = {0360-0300},
doi = {10.1145/2499621},
year = {2014},
journal = {ACM Computing Surveys},
volume = {46},
number = {3},
pages = {33:1-33:33},
abstract = {The last 20 years have seen an ever increasing research activity in the field of human activity recognition. With activity recognition having considerably matured so did the number of challenges in designing, implementing and evaluating activity recognition systems. This tutorial aims to provide a comprehensive hands-on introduction for newcomers to the field of human activity recognition. It specifically focuses on activity recognition using on-body inertial sensors. We first discuss the key research challenges that human activity recognition shares with general pattern recognition and identify those challenges that are specific to human activity recognition. We then describe the concept of an activity recognition chain (ARC) as a general-purpose framework for designing and evaluating activity recognition systems. We detail each component of the framework, provide references to related research and introduce the best practise methods developed by the activity recognition research community. We conclude with the educational example problem of recognising different hand gestures from inertial sensors attached to the upper and lower arm. We illustrate how each component of this framework can be implemented for this specific activity recognition problem and demonstrate how different implementations compare and how they impact overall recognition performance.},
keywords = {}
}

If you find the toolbox useful for your research, please cite the above paper. Thanks!

## HOWTO

Version 1.4, 19 August 2014

### General Notes
### How to reproduce the results from the paper

Execute

### Specific notes on how to create and run your own experiment
- `SETTINGS.CLASSIFIER` (default: `'knnVoting'`)
- `SETTINGS.FEATURE_SELECTION` (default: `'none'`)
- `SETTINGS.FEATURE_TYPE` (default: `'VerySimple'`)
- `SETTINGS.EVALUATION` (default: `'pd'`)
- `SETTINGS.SAMPLINGRATE` (in Hz, default: `32`)
- `SETTINGS.SUBJECT` (default: `1`)
- `SETTINGS.SUBJECT_TOTAL` (default: `2`)
- `SETTINGS.DATASET` (default: `'gesture'`)
- `SETTINGS.CLASSLABELS` (default: `{'NULL', 'Open window', 'Drink', 'Water plant', 'Close window', 'Cut', 'Chop', 'Stir', 'Book', 'Forehand', 'Backhand', 'Smash'}`)
- `SETTINGS.SENSOR_PLACEMENT` (default: `{'Right hand', 'Right lower arm', 'Right upper arm'}`)
- `SETTINGS.FOLDS` (default: `26`)
- `SETTINGS.SENSORS_AVAILABLE = {'acc_1_x', 'acc_1_y', 'acc_1_z', 'gyr_1_x', 'gyr_1_y', 'acc_2_x', 'acc_2_y', 'acc_2_z', 'gyr_2_x', 'gyr_2_y', 'acc_3_x', 'acc_3_y', 'acc_3_z', 'gyr_3_x', 'gyr_3_y'}`
- `SETTINGS.SENSORS_USED` (default: `{'acc_1', 'acc_2', 'acc_3', 'gyr_1', 'gyr_2', 'gyr_3'}`)
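The settings above are fields of a `SETTINGS` struct. As a minimal sketch of how an experiment might be customised, one could override a few defaults before running the toolbox (the surrounding workflow, i.e. where `SETTINGS` is initialised and which script consumes it, is an assumption not confirmed by this excerpt):

```matlab
% Sketch: override selected defaults before running an experiment.
% Field names and default values are taken from the list above; how the
% toolbox picks up SETTINGS afterwards is assumed, not shown here.
SETTINGS.CLASSIFIER   = 'knnVoting';   % classifier used for recognition
SETTINGS.FEATURE_TYPE = 'VerySimple';  % feature set to extract
SETTINGS.SAMPLINGRATE = 32;            % sensor sampling rate in Hz
SETTINGS.SUBJECT      = 1;             % subject to evaluate
SETTINGS.DATASET      = 'gesture';     % dataset name
% Restrict the experiment to the three accelerometers only:
SETTINGS.SENSORS_USED = {'acc_1', 'acc_2', 'acc_3'};
```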
### Optional third-party libraries