Hand Tracking

We believe hand interaction is the future of virtual reality.

Great UX for the End User

"It just works!" This is what a lot of users have to say after testing developed applications powered by Interhaptics. Using Interhaptics tools can create unforgettable user experiences. It ensures that your virtual reality projects are made for the finest hand tracking systems and with high quality interactions.

Cross Compatibility with All Hand Tracking on the Market

Compatibility is just one less thing for you to worry about with Interhaptics! We take care of the background integration so you can focus on creating amazing virtual experiences for your end users. Using Interhaptics tools, you can deploy your creations to a variety of platforms, consoles, and controllers.

Realistic Hand Snapping Management

Great virtual reality experiences require precise management of hand behavior and interaction. We give you a completely customizable tool so you can achieve the best results in very little time. With our hand snapping feature, the Interaction Builder makes it possible to interact with virtual objects in realistic ways, allowing users to hold and use virtual objects as though they were real items in their hands.

Completely Controllable by the Developer

Nothing is written in stone. You can modify and customize the hand behavior as you like.

All great, but how does hand tracking work?
Let's get technical.

The Interhaptics Engine is a real-time 3D interaction and haptics rendering pipeline. A large part of its power comes from its ability to recreate hand interactions that are high quality, precise, and believable. How did we achieve this?

The Logical Input Problem for Hand Tracking

Hand tracking is an amazing piece of technology that provides a transparent interface with the mixed reality world. Users respond naturally while using hand tracking, and it has the potential to create amazing user experiences throughout a variety of extended reality applications.

One of the current challenges of hand tracking is how to translate an analog input (the hands) into a logical input that can be interpreted by your software. Bonus points if it works seamlessly, with no learning curve, for 90% of the population, offering them a satisfactory experience.

As you can imagine, this is not a straightforward process, but Interhaptics is taking care of it for you.

How do we achieve that?

Humans already know how to use their hands proficiently; the trick is to build a software layer that lets users use their hands in various virtual environments in the most natural manner, without adding needless frustration.

To achieve that, we took inspiration from the science of human object manipulation: Haptics.

Our Solution

To reach our ambitious objectives, we segmented the grasping and interaction mechanism into logically independent modules and optimized each one based on human behavior and perception.

Interhaptics segments the grasping and interaction mechanism into logically independent modules.

Grasp Recognition

When we look in detail at how humans grasp and manipulate objects, we find the following map.

A map that categorizes how humans grasp and manipulate objects.

Source: T. Feix, J. Romero, H.-B. Schmiedmayer, A. M. Dollar, and D. Kragic, "The GRASP Taxonomy of Human Grasp Types," IEEE Transactions on Human-Machine Systems, vol. 46, no. 1, pp. 66–77, Feb. 2016.


This subdivision is fundamentally split into two major categories: power and precision manipulation. The division is based on how the position of the object is controlled: power manipulation relies on hand-palm control, while precision manipulation relies on finger control.

This division reveals radically different ways in which humans manipulate different objects. For interaction purposes, we should also consider how the pre-manipulation action accommodates the final position, since modeling these interactions in a virtual environment requires the full spectrum of the movements involved.




Finger pad control for precision manipulation and hand palm control for power manipulation.
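
To make this concrete, here is a minimal sketch of how a grasp might be classified as power or precision from tracked contact points. This is not the Interhaptics implementation; the type names, fields, and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch: classify a grasp as "power" or "precision"
# based on which parts of the hand are in contact with the object.
# Names and thresholds are assumptions, not the Interhaptics API.

@dataclass
class HandContacts:
    palm_in_contact: bool        # palm touching the object surface
    fingertips_in_contact: int   # number of finger pads touching it

def classify_grasp(contacts: HandContacts) -> str:
    """Power grasps press the object against the palm;
    precision grasps hold it with the finger pads only."""
    if contacts.palm_in_contact and contacts.fingertips_in_contact >= 2:
        return "power"      # object position controlled by the palm
    if contacts.fingertips_in_contact >= 2:
        return "precision"  # object position controlled by the fingers
    return "none"           # not enough contact to count as a grasp

# Example: thumb + index pads touching, palm clear -> precision grasp.
print(classify_grasp(HandContacts(palm_in_contact=False, fingertips_in_contact=2)))
```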

Grasp Threshold

Implementing different algorithms as a function of the human grasping movement is not enough to capture a solid model of human interaction with objects.

To complete the grasping optimization, we implemented a tunable grasping strength threshold determined by the size of the object the user would like to grasp. This feature accounts for the different human pre-manipulation strategies.

A tunable grasping strength threshold as a function of the size of the object the user wants to grasp.

This lets the Interhaptics Engine adapt to the user's natural behavior while they use the application.

The initiation phase of a grasping 3D interaction can now be optimized for a large panel of human behaviors, in both the pre-manipulation and object manipulation phases.
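
As a rough illustration of the idea, a size-dependent threshold could look like the sketch below. The function names, margin, and scale values are assumptions for the example, not the engine's internals.

```python
# Minimal sketch of a size-dependent grasp threshold: a grasp starts
# when the measured grip aperture closes to within a margin of the
# object's diameter. All constants here are illustrative.

def grasp_threshold(object_diameter_m: float,
                    margin: float = 0.015,
                    scale: float = 1.1) -> float:
    """Aperture (thumb-to-finger distance, meters) below which a grasp
    begins. Larger objects tolerate a wider aperture; `margin` and
    `scale` are the tunable parameters."""
    return object_diameter_m * scale + margin

def is_grasping(grip_aperture_m: float, object_diameter_m: float) -> bool:
    return grip_aperture_m <= grasp_threshold(object_diameter_m)

# A 4 cm ball: the grasp starts once the aperture closes below ~5.9 cm.
print(is_grasping(grip_aperture_m=0.055, object_diameter_m=0.04))  # True
```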




3D Transformations and Ending Conditions

The next step is to manage the execution of the 3D interaction. For this, we developed the concept of the interaction primitive:

Concept of Interaction Primitive: starting condition, transformation, and ending condition.

The starting condition in the diagram above can be given by the grasp pose estimation described above, as well as by contact with objects, button clicks, or other actions on 3D objects.
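
One way to picture an interaction primitive is as a small per-object state machine: a starting condition, a per-frame transformation, and an ending condition. The sketch below is an assumed, simplified structure, not the Interhaptics API.

```python
from enum import Enum, auto
from typing import Callable

# Sketch of an "interaction primitive": a starting condition, a
# transformation applied every frame while active, and an ending
# condition. Names are illustrative assumptions.

class Phase(Enum):
    IDLE = auto()
    ACTIVE = auto()

class InteractionPrimitive:
    def __init__(self,
                 start: Callable[[], bool],          # e.g. grasp detected
                 transform: Callable[[float], None],  # moves the object
                 end: Callable[[], bool]):            # e.g. grasp released
        self.start, self.transform, self.end = start, transform, end
        self.phase = Phase.IDLE

    def update(self, dt: float) -> None:
        if self.phase is Phase.IDLE and self.start():
            self.phase = Phase.ACTIVE
        elif self.phase is Phase.ACTIVE:
            if self.end():
                self.phase = Phase.IDLE
            else:
                self.transform(dt)  # apply the constrained 3D transformation
```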

Interhaptics includes a large number of the 3D interactions that humans constantly use and that are recurrently implemented to add interactivity to virtual reality and mixed reality applications. Some of them are described here:



Strategy: Free Movement
Description: The object moves freely in rotation and position.
Use cases: Most common objects that you grab or pinch in the environment (tools, components, pens, mugs, etc.).

Strategy: Rotation around pivot*
Description: The object's movement is constrained to a rotation around an axis.
Use cases: Doors, levers, valves, wheels.

Strategy: Sliding*
Description: The object's movement is constrained to follow a specific axis.
Use cases: Any sliding object (doors, box caps, UI sliders, drawers, buttons, etc.).

Strategy: Free + fixed rotation* / Free + Y rotation
Description: A modified version of free movement that applies constraints on specific rotations.
Use cases: Mostly big objects that you want to move freely; fixing rotations prevents large objects from spinning quickly and confusing the user.

*More spatial constraints can be applied to define the axis of rotation/sliding as well as limits of rotation and sliding.

All strategies can be triggered by contact from the hand or fingers, by a grab, or by a pinch.
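
For illustration, here is a minimal sketch of the math behind two of the constrained strategies above. The function names, default limits, and clamping behavior are assumptions for the example, not the engine's implementation.

```python
# Sketch of two constrained transformation strategies from the table
# above. Values and signatures are illustrative only.

def slide_along_axis(grab_delta, axis, limits=(-0.5, 0.5)):
    """"Sliding": project the hand's displacement onto a single unit
    axis, then clamp to the object's travel limits (meters)."""
    # dot product keeps only the motion component along the axis
    d = sum(g * a for g, a in zip(grab_delta, axis))
    return max(limits[0], min(limits[1], d))

def rotate_around_pivot(angle_deg, delta_deg, limits=(0.0, 110.0)):
    """"Rotation around pivot": accumulate rotation about a fixed hinge
    axis, clamped to joint limits, e.g. a door opening at most 110 deg."""
    return max(limits[0], min(limits[1], angle_deg + delta_deg))

# A drawer pulled 12 cm along its axis, within its 50 cm travel:
print(slide_along_axis((0.12, 0.0, 0.0), (1.0, 0.0, 0.0)))  # 0.12
```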

These transformations are implemented within the Interhaptics Engine; see the engine documentation for more on how it works.




Hand Snapping and Poses

There are fundamentally two ways to manage hands while interacting with virtual objects: leave the hands free to behave naturally during the interaction, or snap them into the position the user expects when interacting with a predetermined virtual object.

Demonstration explaining the difference between snapping and free hand interactions.

With Interhaptics you can implement both methods by simply setting your preferred parameters in the development environment. Test both and use the one that feels most natural and effective in your virtual experience. You can also modify these parameters to create unique hand snapping interactions.
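
Conceptually, snapping can be seen as blending the tracked hand pose toward a predetermined pose on the object. The sketch below is a simplified assumption of how such a blend might work; the pose representation and parameter names are illustrative.

```python
# Sketch of hand snapping: blend the tracked hand pose toward a
# predetermined pose authored on the object. Illustrative only.

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def snap_pose(tracked: list[float], target: list[float],
              weight: float) -> list[float]:
    """weight = 0.0 leaves the hand free; weight = 1.0 fully snaps it.
    Poses are flat lists of joint values here for simplicity."""
    w = max(0.0, min(1.0, weight))
    return [lerp(tr, tg, w) for tr, tg in zip(tracked, target)]

# In practice the weight would be animated from 0 to 1 over a short
# transition (e.g. ~100 ms) so the snap feels smooth rather than abrupt.
```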




Naturally Inspired Delays and Tracking Noise Rejection

All the techniques described above work well in the ideal case. There are two factors we must consider for the technology to work seamlessly:

1. Software is far faster than human response time.
2. The hand tracking signal is messy and noisy, and it will remain messy and noisy for the foreseeable future.

Where have we encountered this technological issue before? One example is the data acquired from a capacitive touch screen in a smartphone, which can be noisy and unreliable.

We applied many of the HMI (human-machine interface) techniques used to stabilize 2D interactions on phones to create a great user experience for hand tracking users.
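
Two classic techniques of this kind are a low-pass filter to reject jitter and a short confirmation delay to debounce grasp events; because software is far faster than human response time (~150 ms), a small delay is imperceptible. The sketch below shows the general idea with assumed constants; it is not the Interhaptics implementation.

```python
# Sketch of two stabilization techniques borrowed from 2D touch HMI.
# All constants are illustrative assumptions.

class LowPass:
    """Exponential low-pass filter to smooth noisy tracking values."""
    def __init__(self, alpha: float = 0.3):
        self.alpha, self.state = alpha, None

    def filter(self, x: float) -> float:
        if self.state is None:
            self.state = x
        else:
            self.state = self.alpha * x + (1.0 - self.alpha) * self.state
        return self.state

class Debounce:
    """Only report a grasp after it has been held for `hold_s` seconds,
    rejecting single-frame tracking glitches."""
    def __init__(self, hold_s: float = 0.08):
        self.hold_s, self.timer = hold_s, 0.0

    def update(self, raw_grasp: bool, dt: float) -> bool:
        self.timer = self.timer + dt if raw_grasp else 0.0
        return self.timer >= self.hold_s
```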

Controllers and Third-Party Interface Porting

All of this is great, but what if I want my virtual scenario to work with controllers?

No problem! Check out the Interhaptics Engine implementation: with the solid logical architecture of our interaction-based system, everything can be ported to almost any platform with solid results. We will be working extensively in this direction soon.
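
The porting idea can be pictured as the interaction logic consuming an abstract grasp signal, so hands and controllers become interchangeable inputs. The interface and method names below are illustrative assumptions, not the Interhaptics API.

```python
from typing import Protocol

# Sketch of input abstraction for porting: the interaction layer only
# sees a normalized grasp strength, whatever the device behind it.

class GraspSource(Protocol):
    def grasp_strength(self) -> float: ...  # 0.0 (open) .. 1.0 (closed)

class HandTrackingSource:
    def __init__(self, tracker):
        self.tracker = tracker  # hypothetical hand tracking backend
    def grasp_strength(self) -> float:
        return self.tracker.finger_closure()  # derived from joint angles

class ControllerSource:
    def __init__(self, gamepad):
        self.gamepad = gamepad  # hypothetical controller backend
    def grasp_strength(self) -> float:
        return self.gamepad.grip_axis()  # read directly from the trigger
```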
