
Detecting Custom Gestures

Starting with FingerGestures 3.0, it's now possible to detect custom gestures using the PointCloudRecognizer. The implementation of the gesture pattern matching algorithm is based on the $P recognizer. It currently supports single-stroke gestures only, but support for multi-strokes will be added in a future update.

The PointCloudRecognizer will compare the user input against a set of gesture templates and return the closest match, along with a matching score and distance value.

PointCloud gestures are scale and direction invariant, which means they will be recognized even if the user draws them bigger/smaller or in the opposite direction as the gesture was recorded (e.g. right to left instead of left to right). They are however not rotation invariant, so the orientation of the gesture matters.

PointCloud Gesture Templates

A PointCloud gesture template contains the data representing a single gesture pattern you want to match. Here's an in-editor preview of the loop gesture template included with the samples:

To create a new PointCloud gesture template from the editor, follow the steps below.

Step 1

In your Project panel, right click anywhere to bring up the context menu and click Create > PointCloud Gesture Template at the bottom.

A new pointcloud gesture template file/asset will be added to your project. Rename it to “MyGestureTemplate” or anything else that's meaningful to you.

Step 2

Select the gesture template and ensure it's visible in the Inspector panel. You should see an empty Gesture Preview region at the bottom, and a big “Edit” button at the top.

Click on the Edit button. The “Gesture Editor” window should show up.

Step 3

Click on the window and paint your desired gesture. In the example below, I painted a B shape. You can retry as many times as you want.

Once you are satisfied with the result, click the Apply button to save the changes. If you click on the template file again in the assets browser, you should now see your new gesture in the inspector preview:

Using the PointCloudRecognizer

Now that we have a gesture template, let's see how we can get it recognized at runtime.

Step 1

  1. Make sure your scene is set up properly
  2. Create a new “Gestures” object
  3. Add a PointCloudRecognizer component to it

We are particularly interested in the following properties:

  • Max Match Distance: threshold value that controls how accurate the user-generated gesture must be in order to match its corresponding template gesture. The lower the value, the more accurate the user must be.
  • Sampling Distance: minimum distance between two consecutive finger position samples. Smaller means more accurate recording of the gesture, but more samples to process.
  • Gesture Templates List: this is where we can specify the collection of gesture templates we want to match against.
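If you prefer to configure these from code, something like the following sketch could work. Note that the property names here (MaxMatchDistance, SamplingDistance) are assumed from the inspector labels above and should be verified against the actual PointCloudRecognizer API:

```csharp
// Hypothetical sketch: property names are guessed from the inspector labels,
// verify them against the actual PointCloudRecognizer API before using.
PointCloudRecognizer recognizer = gameObject.AddComponent<PointCloudRecognizer>();
recognizer.MaxMatchDistance = 2.0f;   // lower = user must draw more accurately to match
recognizer.SamplingDistance = 5.0f;   // smaller = denser, more accurate gesture sampling
```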

Step 2

Add your “MyGestureTemplate” template to the Gesture Templates List by dragging it onto the property field in the inspector:

Our custom gesture recognizer is now ready to be used. All we need to do now is listen to the gesture event and react to it.

Step 3

  1. Create a new C# script called PointCloudTutorial.cs
  2. Add it to our “Gestures” object, below the PointCloudRecognizer component
  3. Edit the PointCloudTutorial script to look like this:
using UnityEngine;

public class PointCloudTutorial : MonoBehaviour
{
    // Invoked by the PointCloudRecognizer when the user's input matches one of its gesture templates
    void OnCustomGesture( PointCloudGesture gesture )
    {
        Debug.Log( "Recognized custom gesture: " + gesture.RecognizedTemplate.name + 
            ", match score: " + gesture.MatchScore + 
            ", match distance: " + gesture.MatchDistance );
    }
}

The gesture event data contains several important properties:

  • gesture.RecognizedTemplate references the PointCloudTemplate that was deemed to be the best match
  • gesture.MatchScore is a percent value of how closely the user's gesture matched the template (1.0 means perfect match)
  • gesture.MatchDistance is an absolute measurement of how closely the user's gesture matched the template

As usual, you can also access all the other common gesture event properties such as position and selected object.
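In practice, you may want to ignore weak matches rather than react to every recognized gesture. A minimal sketch, using the same event handler with an arbitrary score cutoff chosen for illustration:

```csharp
using UnityEngine;

public class PointCloudTutorial : MonoBehaviour
{
    // Arbitrary cutoff for illustration; tune this for your own templates
    const float MinMatchScore = 0.8f;

    void OnCustomGesture( PointCloudGesture gesture )
    {
        if( gesture.MatchScore < MinMatchScore )
            return; // too dissimilar from the template, ignore it

        Debug.Log( "Accepted gesture: " + gesture.RecognizedTemplate.name );
    }
}
```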

Creating Gesture Templates From Code

The gesture editor isn't the only way to create custom gesture templates. You can use the PointCloudTemplate API to create gesture templates at runtime or from your own editor extension. This can be useful for letting users record their own gestures in your application.

Here's a code sample that creates a PointCloudRecognizer and two gesture templates at runtime:

void Awake()
{
    PointCloudGestureTemplate triangle = ScriptableObject.CreateInstance<PointCloudGestureTemplate>();
    triangle.name = "Triangle Gesture Template";
    triangle.BeginPoints();
    triangle.AddPoint( 0, 1, 1 );
    triangle.AddPoint( 0, 2, 2 );
    triangle.AddPoint( 0, 3, 1 );
    triangle.AddPoint( 0, 1, 1 );
    triangle.EndPoints();
 
    PointCloudGestureTemplate square = ScriptableObject.CreateInstance<PointCloudGestureTemplate>();
    square.name = "Square Gesture Template";
    square.BeginPoints();
    square.AddPoint( 0, 2, 1 );
    square.AddPoint( 0, 2, 3 );
    square.AddPoint( 0, 4, 3 );
    square.AddPoint( 0, 4, 1 );
    square.AddPoint( 0, 2, 1 );
    square.EndPoints();
 
    PointCloudRecognizer recognizer = gameObject.AddComponent<PointCloudRecognizer>();
    recognizer.AddTemplate( triangle );
    recognizer.AddTemplate( square );
}

The first parameter of AddPoint is the stroke index. It is not used yet, because the PointCloudRecognizer currently supports single-stroke gestures only; multi-stroke support will be added at a later point.

When EndPoints() is called, the gesture template is normalized, meaning that all the points are remapped to the 0..1 range.
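The triangle template above illustrates the idea: its points span X from 1 to 3 and Y from 1 to 2. A conceptual sketch of such a normalization (not the actual library code, which may differ in details) could look like this:

```csharp
// Conceptual sketch of the normalization step: remap each point into the
// 0..1 range based on the gesture's bounding box, preserving aspect ratio.
Vector2[] points = { new Vector2( 1, 1 ), new Vector2( 2, 2 ), new Vector2( 3, 1 ), new Vector2( 1, 1 ) };

float minX = float.MaxValue, maxX = float.MinValue;
float minY = float.MaxValue, maxY = float.MinValue;

foreach( Vector2 p in points )
{
    minX = Mathf.Min( minX, p.x ); maxX = Mathf.Max( maxX, p.x );
    minY = Mathf.Min( minY, p.y ); maxY = Mathf.Max( maxY, p.y );
}

// Divide by the largest extent so the shape is not distorted
float size = Mathf.Max( maxX - minX, maxY - minY );

for( int i = 0; i < points.Length; i++ )
    points[i] = new Vector2( ( points[i].x - minX ) / size, ( points[i].y - minY ) / size );

// The triangle's points become (0,0), (0.5,0.5), (1,0), (0,0)
```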

manual/gestures/pointcloud.txt · Last modified: 2013/03/12 12:21 by wravaine