The Gesture Recognition Toolkit
Authors: Nicholas Gillian, Joseph A. Paradiso
JMLR 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The Gesture Recognition Toolkit is a cross-platform open-source C++ library designed to make real-time machine learning and gesture recognition more accessible for non-specialists. Emphasis is placed on ease of use, with a consistent, minimalist design that promotes accessibility while supporting flexibility and customization for advanced users. The toolkit features a broad range of classification and regression algorithms and has extensive support for building real-time systems. |
| Researcher Affiliation | Academia | Nicholas Gillian (EMAIL), Joseph A. Paradiso (EMAIL); Responsive Environments Group, Media Lab, Massachusetts Institute of Technology, Cambridge, MA 02139, USA |
| Pseudocode | Yes | The code example below demonstrates the core design principles of the GRT. This example shows how to set up a custom gesture-recognition system consisting of a moving-average filter preprocessing module, a fast Fourier transform and custom feature extraction modules, an AdaBoost classifier, and a timeout-filter post-processing module. `//Setup a custom recognition pipeline 1: GestureRecognitionPipeline pipeline; 2: pipeline.addPreProcessingModule( MovingAverageFilter( 5 ) ); 3: pipeline.addFeatureExtractionModule( FFT( 512 ) );` |
| Open Source Code | Yes | The gesture recognition toolkit is open source under the MIT license and has been publicly available since 2012, receiving over ten thousand hits on the toolkit's website.4 It has been downloaded several thousand times and has built up a community of over 300 users on the toolkit's forum. ... 4. The toolkit's website can be found at: http://www.nickgillian.com/software/grt. |
| Open Datasets | No | The paper mentions loading a training dataset from a CSV file (e.g., `trainingData.load("TrainingData.csv")`) and states that the toolkit supports recording, labeling, and managing datasets. However, it does not provide concrete access information (link, DOI, citation) for any specific open dataset used or referenced in the paper. |
| Dataset Splits | No | The paper mentions that the toolkit supports 'partitioning data into validation and test data sets, running cross validation'. However, it does not provide specific details on dataset splits (e.g., percentages, sample counts, or methodology) for any particular dataset discussed or used in the paper. |
| Hardware Specification | No | The paper mentions commercial sensors like 'Microsoft Kinect' or 'Arduino' as sources of input data for the toolkit. However, it does not specify any hardware details (e.g., GPU/CPU models, memory) used by the authors to conduct their experiments or develop the toolkit itself. |
| Software Dependencies | No | The paper describes 'The Gesture Recognition Toolkit' as a 'cross platform open source C++ machine-learning library'. It also mentions other libraries like 'LIBSVM' which it wraps. However, it does not provide specific version numbers for any C++ compiler, operating system, or other software dependencies required to build or run the toolkit. |
| Experiment Setup | Yes | `//Setup a custom recognition pipeline 1: GestureRecognitionPipeline pipeline; 2: pipeline.addPreProcessingModule( MovingAverageFilter( 5 ) ); 3: pipeline.addFeatureExtractionModule( FFT( 512 ) ); 4: pipeline.addFeatureExtractionModule( MyCustomFeatureAlgorithm() ); 5: pipeline.setClassifier( AdaBoost( DecisionStump() ) ); 6: pipeline.addPostProcessingModule( ClassLabelTimeoutFilter( 1000 ) );` |