
Interhaptics Engine

The Interhaptics engine is the workhorse powering Interhaptics.

Haptics and 3D Interactions Rendering for all devices

The Interhaptics engine renders real-time high-fidelity Haptics and 3D interactions for XR and mobile.

Spatial Haptics

Render precise and refined spatial details in your haptics experience: satisfying, discernible button clicks, soft and hard objects, and fine spatial detail in haptic textures. 2D and 3D spaces are no issue for the Interhaptics engine.




Haptics Hyper Sampling

Get the best haptics experience with today's and future technologies. The Haptics Hyper Sampling technology of Interhaptics renders haptics above 1 kHz, at up to 1-micrometer resolution.




3D Interactions Management

Compute and render 3D interactions for hands in real time. The Interhaptics engine lets you easily port your creation to any platform thanks to externalized computation.




Collision and Event-Based Haptics

Trigger haptics based on contact with virtual objects, external events, and parametric interactions.




All great, but how does all this work?
We will get a little technical here.

The Interhaptics engine is a real-time 3D interaction and haptics rendering process.


Here are some features:

Spatial Haptics and the 4 Perceptions

Interhaptics takes inspiration from the human perception of the sense of touch. Humans can perceive 20-25 discernible touch sensations, but existing haptic devices can convincingly render just 4 of them: vibrations, textures, stiffness, and temperature. Each of these perceptions is built on one solid perceptual characteristic, which we extracted and modeled:

Existing haptic devices can render just 4 types of sensations: vibrations, textures, stiffness, and temperature.

2 of them, textures and stiffness, are perceptually encoded in space (the faster you move, the faster the sensation unfolds), and 2 of them are encoded in time. Take the example of a button: the sensation we perceive when we touch it is a function of how far we press, until a click happens and the force suddenly drops:

When pressing a button, the sensation we perceive is a function of how much force we exert.

This behavior is intrinsic to how humans perceive touch sensations. All of these algorithms and concepts are included in the Interhaptics engine and computed in real time to give users the best possible haptics experience.
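To make the distinction concrete, here is a minimal sketch in Python. The function names and signal shapes are invented for illustration; they are not part of the Interhaptics API:

```python
import math

def texture_amplitude(position_m: float, spatial_period_m: float = 0.002) -> float:
    """Spatially encoded effect: output depends on WHERE the finger is,
    so moving faster over the surface makes the felt oscillation faster."""
    return math.sin(2 * math.pi * position_m / spatial_period_m)

def vibration_amplitude(t_s: float, frequency_hz: float = 170.0) -> float:
    """Time-encoded effect: output depends only on WHEN we are in the
    effect, regardless of hand movement."""
    return math.sin(2 * math.pi * frequency_hz * t_s)

def button_force(press_depth_m: float, click_depth_m: float = 0.003,
                 stiffness_n_per_m: float = 800.0) -> float:
    """Toy force/displacement curve for a clicking button: the resisting
    force grows with depth, then suddenly drops past the click point."""
    if press_depth_m < click_depth_m:
        return stiffness_n_per_m * press_depth_m
    return 0.3 * stiffness_n_per_m * press_depth_m  # sudden post-click drop
```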

Haptics Hyper Sampling

There is a well-known problem in the field of haptics, rooted in signal-processing theory: the data acquisition rate of the system determines the accuracy of your processed signal. Position acquisition is fundamental to rendering textures and stiffness, yet XR and mobile systems acquire position at rates of at most 120 Hz. Where is the catch? Haptics, to work properly, requires rendering frequencies in the 1 kHz region. We run here into the well-known problem of under-sampling: you can have the best haptics actuator on the planet, but if you cannot control it fast enough, it is pretty useless.

(See https://link.springer.com/chapter/10.1007/978-3-319-42321-0_23 and https://dl.acm.org/doi/abs/10.1145/3025453.3026010 for a broader academic discussion of the problem and case-specific solutions.)

The data acquisition rate of the haptic system determines the accuracy of your haptic feedback.

Interhaptics solves this problem in a general manner. We included a model of human motor behavior and haptics perception within the Interhaptics engine, and we use this model to hyper-sample the user's position and deliver a stable haptics perception for both spatial and time-based haptics.

Using this approach, Interhaptics can create an advanced, perceptually accurate haptics rendering.

This allows Interhaptics to create a perceptually accurate haptics rendering far above the tracking system's ability to acquire position. The method works because we account for touch perception and the limits of human motor behavior within the rendering algorithm.
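To make the idea concrete, here is a minimal sketch in Python. It is a stand-in under loud assumptions, not the engine's actual model: the human motor-behavior model is replaced with simple constant-velocity extrapolation, predicting intermediate positions between two 120 Hz tracking samples so a 1 kHz haptics loop always has fresh input:

```python
def hyper_sample(p_prev_m: float, p_curr_m: float,
                 tracking_dt_s: float = 1 / 120,
                 haptics_dt_s: float = 1 / 1000) -> list[float]:
    """Upsample 120 Hz tracked positions to a 1 kHz haptics clock.
    A constant-velocity predictor stands in for the engine's model of
    human motor behavior."""
    velocity = (p_curr_m - p_prev_m) / tracking_dt_s
    steps = round(tracking_dt_s / haptics_dt_s)  # ~8 haptic ticks per frame
    return [p_curr_m + velocity * i * haptics_dt_s for i in range(1, steps + 1)]
```

Each predicted position can then drive a spatially encoded effect, such as the texture function sketched earlier, at the full 1 kHz rate.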

3D Interactions Real-Time Rendering

One of the issues when deploying a 3D application is how to manage the different inputs and outputs on various platforms. There are several kinds of inputs: hands, controllers, gamepads, buttons, or even touch screens for an AR application. Fundamentally, these interfaces do the same job during interactions: define an input, execute a spatial transformation, and meet an ending condition.

Most input interfaces define an input, execute a spatial transformation, and meet an ending condition.

While more numerous than 2D gestures such as drag and drop or pinch and zoom, 3D interactions can be categorized into a pre-set of spatial transformations useful for Human-Machine Interaction. We categorized a large set of 3D interaction primitives covering much of the hand and finger input used in today's 3D applications, and encoded their behavior within the Interhaptics engine for a faster development process.



Strategy: Free Movement
Description: The object moves freely in rotation and position.
Use cases: Most common objects that you grab or pinch in the environment (tools, components, pens, mugs, etc.).

Strategy: Rotation around pivot*
Description: The object's movement is constrained to a rotation around an axis.
Use cases: Doors, levers, valves, wheels, and the like.

Strategy: Sliding*
Description: The object's movement is constrained to follow a specific axis.
Use cases: Any sliding object (doors, box caps, UI sliders, drawers, buttons, etc.).

Strategy: Free + fixed rotation* (e.g., Free + Y rotation)
Description: A modified version of free movement that applies constraints on specific rotations.
Use cases: Mostly large objects that you want to move freely; fixing rotations keeps big objects from spinning or rotating fast and confusing the user.



*Additional spatial constraints can be applied to define the axis of rotation/sliding, as well as rotation and sliding limits.



All strategies can be triggered by contact from the hand or fingers, by grab, or by pinch.

These interaction primitives work much like the drag-and-drop or pinch-and-zoom algorithms implemented in the touch panel of your smartphone. The Interhaptics engine lets you carry these interaction primitives to any platform without rewriting any code.
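As an illustration of the pattern (hypothetical class names; this is not the Interhaptics API), each primitive can be modeled as a strategy that constrains a target pose every frame, between the starting input (contact, grab, or pinch) and the ending condition (release):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]
    rotation: tuple[float, float, float]  # Euler angles in radians

class InteractionStrategy:
    """Common shape of a 3D interaction primitive: start on an input,
    constrain the object's transform every frame, stop on an ending
    condition."""
    def constrain(self, target: Pose) -> Pose:
        raise NotImplementedError

class FreeMovement(InteractionStrategy):
    def constrain(self, target: Pose) -> Pose:
        return target  # position and rotation are unconstrained

class Sliding(InteractionStrategy):
    def __init__(self, axis_index: int, origin: Pose):
        self.axis_index, self.origin = axis_index, origin
    def constrain(self, target: Pose) -> Pose:
        # Keep only the motion component along the allowed axis.
        position = list(self.origin.position)
        position[self.axis_index] = target.position[self.axis_index]
        return Pose(tuple(position), self.origin.rotation)
```

Because each frame only asks the active strategy to constrain a pose, the same primitives can be driven by hands, controllers, or a touch screen without changing the interaction code.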

This approach turbocharges the cross-compatibility of your interactive application across platforms: no extra code is needed to port it!

Collision, Event-Based, and Interaction-Based Haptics

Last but not least: how do you include haptics in your application?

Historically, there are 2 separate ways to include haptics in applications. One is based on a physics simulation of contacts and forces (touch an object, feel haptic feedback); the other is based on feedback generated by events (click a button, feel a predetermined confirmation pattern).

Interhaptics allows both methods of development. You can set a stiffness to be rendered on contact with an object, as well as deliver a specific vibration pattern when an event happens.
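In sketch form, the two trigger styles look like this. The helper names are hypothetical; `play_force` and `play_pattern` stand in for whatever calls your haptics backend exposes:

```python
def play_force(force_n: float) -> None:
    print(f"render contact force: {force_n:.2f} N")  # stand-in backend call

def play_pattern(name: str) -> None:
    print(f"play stored pattern: {name}")  # stand-in backend call

def on_collision(stiffness_n_per_m: float, penetration_m: float) -> None:
    """Physics-based haptics: feedback is computed from the simulated contact."""
    play_force(stiffness_n_per_m * penetration_m)

def on_event(event_name: str) -> None:
    """Event-based haptics: a predetermined pattern confirms the event."""
    play_pattern(f"{event_name}_confirm")
```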

One more thing.

Once spatial haptics and 3D interactions are accounted for in the same engine, we unlock a third method of including haptics within your application: haptics as a function of interactions.

This third method gives application developers great freedom to create satisfying and engaging interactions and haptic experiences in a few clicks.
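For instance, a door constrained by the rotation-around-pivot strategy can drive its haptics directly from the interaction parameter, with no separate collision or event wiring. A sketch with hypothetical names:

```python
import math

def door_detent_amplitude(hinge_angle_rad: float,
                          detent_every_rad: float = math.radians(15),
                          detent_width_rad: float = math.radians(1)) -> float:
    """Haptics as a function of the interaction: a short 'click' is
    emitted whenever the constrained door angle crosses a detent."""
    offset = hinge_angle_rad % detent_every_rad
    distance_to_detent = min(offset, detent_every_rad - offset)
    return 1.0 if distance_to_detent < detent_width_rad else 0.0
```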

Create superb, realistic, and engaging interactions and haptic experiences in only a few clicks.

All these concepts together make the Interhaptics engine a unique piece of technology, providing a transparent interface for application developers and end users. You can scale the great experience you designed and reach your users on 3 billion devices.
