Hand clamping

The software can detect whether a user is trying to grab GameObjects. This is done using hand clampers. Start by going to the Neuron Generic Animator Instance in the inspector and choosing one of the two currently implemented hand clampers.

Hand clampers

Bone Angle

Bone angle is the more basic of the two clampers and has only one setting: the angle beyond which we assume the index finger is grasping an object. If the current index finger angle is greater than this setting, objects will clamp to the corresponding hand. This clamper only works for skeletons with an index finger.
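
As a rough illustration, the check boils down to comparing the measured index finger angle against the configured threshold. The sketch below is not the actual implementation; the class name, member names, and default value are assumptions.

```csharp
// Illustrative sketch of the bone-angle check; class, member names, and the
// default value are assumptions, not the actual implementation.
public class BoneAngleClamper
{
    // The single setting: the index finger angle (in degrees) above which we assume a grasp.
    public float GraspAngleThreshold = 60f; // example value, not a documented default

    // Returns true when the measured index finger bend exceeds the threshold.
    public bool IsClamping(float indexFingerAngleDegrees)
    {
        return indexFingerAngleDegrees > GraspAngleThreshold;
    }
}
```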

Hidden Markov model

A hidden Markov model can be used for more advanced clamp recognition by looking at the entire hand instead of just the index finger. A model needs to be trained for each hand, which is explained in the Trainer section below. The Left Hand Model setting must be a relative or absolute path to a file containing the trained hidden Markov model for the left hand; the Right Hand Model setting must likewise point to the file with the model for the right hand.
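
Conceptually, this clamper scores a window of whole-hand pose features against the trained model for each hand and decides that a grasp is happening when the score is high enough. The sketch below only illustrates that idea; the interface, class names, and threshold are assumptions, not the tool's actual API.

```csharp
// Conceptual sketch of HMM-based clamp recognition; the interface, class,
// and member names are assumptions, not the tool's actual API.
public interface IHandPoseModel
{
    // Log-likelihood that the observed sequence of whole-hand pose features is a grasp.
    double LogLikelihood(float[][] poseFeatureSequence);
}

public class HmmHandClamper
{
    private readonly IHandPoseModel leftHandModel;   // loaded from the "Left Hand Model" path
    private readonly IHandPoseModel rightHandModel;  // loaded from the "Right Hand Model" path
    private readonly double graspThreshold;          // assumed decision threshold

    public HmmHandClamper(IHandPoseModel left, IHandPoseModel right, double threshold)
    {
        leftHandModel = left;
        rightHandModel = right;
        graspThreshold = threshold;
    }

    // The entire hand is considered: a window of recent pose feature vectors is scored.
    public bool IsLeftHandClamping(float[][] recentLeftHandFrames) =>
        leftHandModel.LogLikelihood(recentLeftHandFrames) > graspThreshold;

    public bool IsRightHandClamping(float[][] recentRightHandFrames) =>
        rightHandModel.LogLikelihood(recentRightHandFrames) > graspThreshold;
}
```

Because the model looks at the whole hand rather than a single finger angle, it can recognize grasps that the Bone Angle clamper would miss, at the cost of needing trained model files for both hands.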

Clampable GameObjects

To enable a GameObject to be picked up by the user, it needs a component script called Clampable Game Object. This script offers two options. The Clamping checkbox enables or disables clamping completely. Clamp Distance is the maximum distance (in world units) at which the GameObject still clamps to the hand. For example, when this is set to a high value such as 100, objects will clamp to the hand even when they are nowhere near each other. Adjust this number to suit the specific Clampable Game Object's needs.

Screenshot: the Clampable Game Object component
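
For illustration, a minimal version of such a component could look like the sketch below; the two public fields mirror the inspector options described above, but the rest of the script is an assumption, not the shipped source.

```csharp
using UnityEngine;

// Illustrative sketch of a Clampable Game Object component; the two public fields
// mirror the documented inspector options, the rest is an assumption.
public class ClampableGameObject : MonoBehaviour
{
    // Enables or disables clamping for this GameObject completely.
    public bool Clamping = true;

    // Maximum distance (in world units) at which this GameObject still clamps to a hand.
    public float ClampDistance = 0.2f; // example value, not a documented default

    // Returns true when the given hand is close enough for this object to clamp to it.
    public bool CanClampTo(Transform hand)
    {
        return Clamping &&
               Vector3.Distance(transform.position, hand.position) <= ClampDistance;
    }
}
```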

Trainer

It is also possible to train your own Markov classifier using a set of prerecorded input data. This section first explains how to record the data and then how to use the recorded data to train a classifier.

Data Recording

You can record motion data by adding a "Hand Motion Recording" script to a skeleton. In this script you can specify whether to record the left hand, whether to record the right hand, and where to save the recorded data for each hand.

Screenshot: the Hand Motion Recording component
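
A minimal sketch of such a recorder is shown below, assuming one sample per frame and a simple CSV output; the checkbox and path fields mirror the options described above, while the hand references, file format, and save logic are assumptions.

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Illustrative sketch of a Hand Motion Recording script; the checkbox and path options
// mirror the documentation, but the hand references, CSV format, and save logic are assumptions.
public class HandMotionRecording : MonoBehaviour
{
    public bool RecordLeftHand = true;
    public bool RecordRightHand = true;
    public string LeftHandOutputPath = "left_hand.csv";   // hypothetical default
    public string RightHandOutputPath = "right_hand.csv"; // hypothetical default

    // Hypothetical references to the hand transforms of the skeleton this script is attached to.
    public Transform LeftHand;
    public Transform RightHand;

    private readonly List<string> leftFrames = new List<string>();
    private readonly List<string> rightFrames = new List<string>();

    void Update()
    {
        // One sample per frame while the scene is playing.
        if (RecordLeftHand && LeftHand != null)
            leftFrames.Add(FormatFrame(LeftHand));
        if (RecordRightHand && RightHand != null)
            rightFrames.Add(FormatFrame(RightHand));
    }

    void OnDestroy()
    {
        // Stopping the scene destroys the component, at which point the data is written out.
        if (RecordLeftHand && leftFrames.Count > 0)
            File.WriteAllLines(LeftHandOutputPath, leftFrames);
        if (RecordRightHand && rightFrames.Count > 0)
            File.WriteAllLines(RightHandOutputPath, rightFrames);
    }

    private static string FormatFrame(Transform hand)
    {
        Vector3 p = hand.position;
        return $"{Time.time},{p.x},{p.y},{p.z}";
    }
}
```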

Once the script is set up, simply start the scene and start moving. When you have gathered all the data you want to record, stop the scene again; at that moment, all recorded data is saved to the specified files.

Markov Model Trainer

After the data is recorded, it is possible to train a new hidden Markov model. To access this feature, navigate to Neuron > Motion Tools > Hand Clamp Trainer. In that dialog, you can add the input files containing the captured data and specify the output file where the Markov model should be stored.

After pressing Start, the Markov model will be trained. Depending on the size of the input data, this may take up to several minutes. When training is finished, the model in the output file can be used with the Generic Animator as described in the Hidden Markov model section above.
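
Behind the dialog, training conceptually amounts to reading the recorded sequences from the input files and fitting a hidden Markov model that is then written to the output file. The sketch below illustrates that flow; the trainer interface and the one-feature-vector-per-line file format are assumptions, not the tool's real API.

```csharp
using System.IO;
using System.Linq;

// Hypothetical trainer interface standing in for the tool's actual HMM implementation.
public interface IHandGraspModelTrainer
{
    // Fits a hidden Markov model to the observation sequences and writes it to outputFile.
    void Fit(float[][][] observationSequences, string outputFile);
}

// Conceptual sketch of what happens after pressing Start in the Hand Clamp Trainer dialog;
// the file format (one comma-separated feature vector per line) is an assumption.
public static class HandClampTrainingSketch
{
    public static void Train(IHandGraspModelTrainer trainer, string[] inputFiles, string outputFile)
    {
        // Each input file is assumed to hold one recorded hand-motion sequence.
        float[][][] sequences = inputFiles.Select(ReadSequence).ToArray();

        // Fit the model (e.g. via Baum-Welch) and store it for use with the Generic Animator.
        trainer.Fit(sequences, outputFile);
    }

    private static float[][] ReadSequence(string path)
    {
        return File.ReadLines(path)
            .Select(line => line.Split(',').Select(float.Parse).ToArray())
            .ToArray();
    }
}
```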