
Hand Tracking

We believe hand interaction is the future of XR.

Great UX for the End User

"It just works", this what a lot of users say after testing developed applications powered by Interhaptics.

Cross-Compatibility with All Hand Tracking Solutions on the Market

We take care of the background integration to allow you to focus on creating.

Realistic Hand Snapping Management

Great XR experiences rely on great management of hands in XR. We give you a fully customizable tool to get great results in little time.

Completely Controllable by the Developer

You can modify and customize the hand behavior as you like. Nothing is written in stone.

All great, but how does all this work?
We will get a little technical here.

The Interhaptics engine is a real-time 3D interaction and haptics rendering process.

The Logical Input Problem for Hand Tracking

Hand tracking is an amazing piece of technology allowing a transparent interface with the mixed reality world. Users respond naturally while using hand tracking, and it has the potential to create amazing user experiences.

One of the current challenges of hand tracking is how to translate an analog input (the hands) into a logical one that your software can interpret. Bonus points if it works seamlessly, without any learning curve, and delivers a satisfactory experience for 90% of the population.

How do we achieve that?

Humans already know how to use their hands proficiently; the trick is to create a software layer that lets users use their hands in XR in the most natural manner, without adding needless frustration.

To achieve that, we took inspiration from the science of object manipulation in humans: haptics.

Our Solution

To reach our ambitious objectives, we segmented the grasping and interaction mechanism into logically independent modules and optimized each one based on human behavior and perception.

Interhaptics segments the grasping and interaction mechanism into logically independent modules.

Grasp Recognition

When we look in detail at how humans grasp and manipulate objects, we find the following map.

A map that categorizes how humans grasp and manipulate objects.

Source: T. Feix, J. Romero, H. Schmiedmayer, A. M. Dollar and D. Kragic, The GRASP Taxonomy of Human Grasp Types, in IEEE Transactions on Human-Machine Systems, vol. 46, no. 1, pp. 66–77, Feb. 2016.


This taxonomy is fundamentally split into two major categories: Power and Precision manipulation. The division is based on how the position of the object is controlled: finger pad control for precision manipulation, and palm control for power manipulation.

What we see in this division are radically different ways in which humans manipulate different objects. For interaction purposes, we should also consider how the pre-manipulation movement is made to accommodate the final position.




Finger pad control for precision manipulation and hand palm control for power manipulation.
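To make this split concrete, here is a minimal sketch of how a grasp could be classified in code. The contact summary, thresholds, and names below are illustrative assumptions, not the Interhaptics implementation: palm contact indicates power manipulation, finger pad contact alone indicates precision manipulation.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class GraspType(Enum):
    PRECISION = "precision"  # object position controlled by the finger pads
    POWER = "power"          # object position controlled by the palm


@dataclass
class HandContacts:
    """Hypothetical per-frame contact summary coming from a hand tracking source."""
    finger_pad_contacts: int  # number of finger pads touching the object
    palm_contact: bool        # whether the palm is touching the object


def classify_grasp(contacts: HandContacts) -> Optional[GraspType]:
    """Follow the Feix et al. split: palm control -> power, finger pad control -> precision."""
    if contacts.palm_contact and contacts.finger_pad_contacts >= 1:
        return GraspType.POWER
    if contacts.finger_pad_contacts >= 2:
        return GraspType.PRECISION
    return None  # not enough contact to count as a grasp


print(classify_grasp(HandContacts(finger_pad_contacts=2, palm_contact=False)))  # GraspType.PRECISION
print(classify_grasp(HandContacts(finger_pad_contacts=4, palm_contact=True)))   # GraspType.POWER
```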

Grasp Threshold

Implementing different algorithms as a function of the human grasping movement is not enough to capture a solid model of humans interacting with objects.

To complete the grasping optimization, we implemented a tunable grasping strength threshold determined by the size of the object the user would like to grasp. This feature accounts for the different human pre-manipulation strategies.

We implemented a tunable grasping strength threshold as a function of the size of the object the user wants to grasp.

This lets the software adapt its response to the user's natural behavior while using the application.

The initiation phase of a grasping 3D interaction can now be optimized for a wide range of human behaviors, both in the pre-manipulation movements and in the object manipulation phase.
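As an illustration of the idea, here is a minimal sketch of a size-dependent grasp threshold. The function shape, parameter names, and numeric values are assumptions made for the example, not the values used by the engine: small objects need only a light pinch, while larger objects require a stronger, more deliberate closing of the hand before the grasp is engaged.

```python
def grasp_strength_threshold(object_size_m: float,
                             min_threshold: float = 0.2,
                             max_threshold: float = 0.8,
                             reference_size_m: float = 0.15) -> float:
    """Illustrative threshold curve: the larger the object, the more closing
    strength is required before the grasp starts. All values are placeholders."""
    ratio = min(object_size_m / reference_size_m, 1.0)
    return min_threshold + (max_threshold - min_threshold) * ratio


def grasp_started(closing_strength: float, object_size_m: float) -> bool:
    """Start the grasp interaction once the normalized closing strength (0..1)
    exceeds the size-dependent threshold."""
    return closing_strength >= grasp_strength_threshold(object_size_m)


print(grasp_started(0.35, object_size_m=0.03))  # small object: True
print(grasp_started(0.35, object_size_m=0.20))  # large object: False
```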




3D Transformations and Ending Conditions

The next step is to manage how the 3D interaction itself is performed. For this, we developed the concept of the interaction primitive. More information here:

Concept of Interaction Primitive: starting condition, transformation, and ending condition.

The starting condition can come from the grasping pose estimation described above, as well as from contact with objects, button clicks, or other actions on 3D objects.
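A minimal sketch of what such a primitive could look like as a data structure, assuming illustrative names (this is not the Interhaptics API): a starting condition, a per-frame transformation applied while the interaction is active, and an ending condition.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class InteractionPrimitive:
    """Illustrative interaction primitive: start condition, transformation, end condition."""
    start: Callable[[], bool]           # e.g. grasp pose detected, contact, button click
    transform: Callable[[float], None]  # applies the constrained 3D transformation each frame
    end: Callable[[], bool]             # e.g. grasp released, contact lost
    active: bool = False

    def update(self, dt: float) -> None:
        # Activate when the starting condition is met, then apply the
        # transformation every frame until the ending condition fires.
        if not self.active and self.start():
            self.active = True
        if self.active:
            self.transform(dt)
            if self.end():
                self.active = False
```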

Interhaptics includes a large number of 3D interactions that humans use constantly and that are recurrently implemented in XR applications to provide interactivity. Some of them are described here:



Strategy | Description | Use Cases
Free Movement | The object moves freely in rotation and position. | Most common objects that you grab or pinch in the environment (tools, components, pens, mugs, etc.).
Rotation around pivot* | The object's movement is constrained to a rotation around an axis. | Doors, levers, valves, wheels.
Sliding* | The object's movement is constrained to follow a specific axis. | Any sliding object (doors, box caps, UI sliders, drawers, buttons, etc.).
Free + fixed rotation* (e.g. Free + Y rotation) | A modified version of free movement that constrains specific rotations. | Mostly used for big objects that you want to move freely; fixing rotations keeps large objects from spinning quickly and confusing the user.



*Additional spatial constraints can be applied to define the axis of rotation/sliding as well as rotation and sliding limits.

All strategies can be triggered by contact from the hand or fingers, by a grab, or by a pinch.

These transformations are implemented within the Interhaptics engine. You can read more about how it works here.
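As an example of the "Rotation around pivot" strategy in the table above, here is a minimal sketch (with hypothetical names and simplified math, not the engine's implementation) that derives a clamped hinge angle from the tracked hand position:

```python
import math


def hinge_angle_from_hand(hand_pos, pivot_pos, axis, min_deg=0.0, max_deg=110.0):
    """Rotation-around-pivot sketch: project the hand position onto the plane
    orthogonal to the hinge axis and derive a clamped rotation angle for the
    object (e.g. a door limited to 0..110 degrees). Vectors are (x, y, z) tuples
    and the axis is assumed to be unit length."""
    # Vector from the pivot to the hand
    v = tuple(h - p for h, p in zip(hand_pos, pivot_pos))
    # Remove the component along the hinge axis (project onto the rotation plane)
    dot = sum(vi * ai for vi, ai in zip(v, axis))
    v_plane = tuple(vi - dot * ai for vi, ai in zip(v, axis))
    # Angle in the rotation plane, measured from the world X axis for simplicity
    angle = math.degrees(math.atan2(v_plane[2], v_plane[0]))
    # Clamp to the allowed range of the hinge
    return max(min_deg, min(max_deg, angle))


# A hand in front of and to the side of a door hinge on the vertical (Y) axis:
print(hinge_angle_from_hand(hand_pos=(0.5, 1.0, 0.5),
                            pivot_pos=(0.0, 1.0, 0.0),
                            axis=(0.0, 1.0, 0.0)))  # ~45 degrees
```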




Hand Snapping and Poses

There are fundamentally two ways to manage hands while interacting with virtual objects: leave the hands free to behave naturally during the interaction, or force the hand into the pose the user expects when interacting with a predetermined virtual object.

Demonstration explaining the difference between snapping and free hand interactions.

With Interhaptics you can have both, simply by setting the parameter you like in the development environment. Test both and use the one that feels more natural. If you want to modify it later, go ahead. You can do it.
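For illustration, such a per-object choice could be expressed with something like the following sketch (the names and fields are hypothetical, not the actual Interhaptics components or parameters):

```python
from dataclasses import dataclass


@dataclass
class GrabbableObject:
    """Illustrative per-object setting, not the actual Interhaptics component."""
    snap_on_grab: bool = True            # snap the hand to an authored pose...
    snapped_hand_pose: str = "mug_grip"  # ...or leave the tracked hand free


def hand_pose_while_grabbing(obj: GrabbableObject, tracked_pose: str) -> str:
    # Either force the authored pose onto the hand, or keep the raw tracked pose.
    return obj.snapped_hand_pose if obj.snap_on_grab else tracked_pose
```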




Naturally Inspired Delays and Tracking Noise Rejection

All the techniques described above work well in the ideal case. There are two factors we must consider to allow the technology to work seamlessly:

1. Software is much faster than the human response time.
2. The hand tracking signal is messy and noisy, and it will remain messy and noisy for the foreseeable future.

Where have we seen this problem before in technology? The raw data acquired by the capacitive touch screen in your phone is also noisy and unreliable.

We applied many of the techniques used in HMI to stabilize 2D interactions on phones to create a great user experience for hand tracking users. Apple is the master in this field, and we humbly took inspiration and adapted it for 3D interactions.
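As one example of this family of techniques, here is a sketch of an exponential moving average with a small dead zone applied to a tracked joint position. This is a common touch-style stabilization approach and is shown only as an illustration under those assumptions, not as the actual filter used by Interhaptics.

```python
class SmoothedJointPosition:
    """Illustrative stabilization filter for a tracked hand joint: an exponential
    moving average plus a small dead zone that rejects millimeter-scale jitter."""

    def __init__(self, smoothing: float = 0.25, dead_zone_m: float = 0.003):
        self.smoothing = smoothing      # 0 = frozen, 1 = raw (noisy) signal
        self.dead_zone_m = dead_zone_m  # ignore jitter below ~3 mm
        self.value = None               # last filtered (x, y, z) position

    def update(self, raw):
        if self.value is None:
            self.value = raw
            return self.value
        # Ignore movements smaller than the dead zone (tracking jitter)
        delta = tuple(r - v for r, v in zip(raw, self.value))
        if sum(d * d for d in delta) ** 0.5 < self.dead_zone_m:
            return self.value
        # Blend toward the raw sample instead of jumping to it
        self.value = tuple(v + self.smoothing * d for v, d in zip(self.value, delta))
        return self.value


joint = SmoothedJointPosition()
print(joint.update((0.100, 1.000, 0.500)))
print(joint.update((0.101, 1.001, 0.500)))  # within the dead zone: unchanged
print(joint.update((0.150, 1.000, 0.500)))  # real movement: smoothed toward the new sample
```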

Controllers and Third-Party Interface Porting

All of this is great, but what if I want my scenario to also work with controllers? And maybe with the touch screen of my iPad?

No problem! Check out the Interhaptics engine implementation. Thanks to the solid logical architecture of the interaction system, everything can be ported to almost any platform with solid results. We will be working a lot in this direction soon.
