
handinput

Real-time hand tracking and gesture recognition system based on the PhD thesis Real-time Continuous Gesture Recognition for Natural Multimodal Interaction.

Dependencies

To run

To compile

Run

GesturesViewer.exe in the GesturesViewer project is the main interface to run the program. Once the program starts, the "Keys" panel shows the shortcut keys for certain actions. The most important keys are:

  • S: Start the Kinect without gesture recognition.
  • T: Start tracking and gesture recognition. The gestures are defined in gesture_def.txt. See the illustration of how to perform these gestures.

To improve the accuracy of gesture recognition, you need to train your own model.

  1. Click the "Capture Gesture" button to start recording training examples.
  2. Follow the prompts to perform the training gesture examples. The raw gesture data is saved in the {data_dir}/PID-{user_pid}/{time} directory; data_dir is specified in the GesturesViewer/App.config file (see the sketch after this list).
  3. When recording is finished, the program processes the data and trains a new model using all the data recorded in the data_dir directory.
  4. Press "T" to start tracking and gesture recognition.
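A minimal App.config sketch showing where data_dir might be set, assuming it lives in the standard .NET appSettings section (the key name is taken from the step above; the value and everything else here are placeholders):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <appSettings>
        <!-- Root directory for recorded gesture data and trained models. -->
        <!-- Key name "data_dir" follows the README; check the actual file for the exact entry. -->
        <add key="data_dir" value="C:\handinput\data" />
      </appSettings>
    </configuration>

With this value, a session recorded for user 1234 would land in a directory like C:\handinput\data\PID-1234\{time}.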

How to interpret the recognition result

For each frame, the gesture tracking and recognition pipeline outputs its result as a JSON string with the following fields:

    {
      eventType: <type of gesture event: StartPreStroke|StartNucleus|StopNucleus>,
      gesture:   <name of the gesture>,
      phase:     <PreStroke|Nucleus|PostStroke|Rest>,
      rightX:    <x coordinate of the right hand>,
      rightY:    <y coordinate of the right hand>
    }
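The schema above uses unquoted keys for readability; the emitted string is assumed to be valid JSON. A minimal sketch of a consumer, assuming the per-frame results arrive as one JSON object per line on standard input (the field names come from the schema above; the transport and everything else are assumptions):

    import json
    import sys

    # Hypothetical consumer of the per-frame recognition output.
    # Assumes one valid JSON object per line is piped in.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        frame = json.loads(line)

        # Gesture events mark transitions between gesture phases.
        event = frame.get("eventType")
        if event == "StartNucleus":
            print(f"Gesture started: {frame.get('gesture')}")
        elif event == "StopNucleus":
            print(f"Gesture ended: {frame.get('gesture')}")

        # Right-hand coordinates are reported on every frame.
        if frame.get("phase") == "Nucleus":
            print(f"  right hand at ({frame.get('rightX')}, {frame.get('rightY')})")

Since the nucleus is the meaningful stroke of a gesture, the StartNucleus and StopNucleus events are typically the ones an application reacts to.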

Modules

  • GesturesViewer: UI for recording gesture training examples and viewing debug information.
  • Util: reusable utility functions.