Interhaptics is a development suite designed to build 3D interactions and haptic feedback for VR and AR applications. It is composed of three tools:

• Haptics Composer: a standalone graphical tool developed to design, test, and iterate on your haptic experience.

• Interaction Builder: a no-code plug-in used to easily design, develop, and implement complex 3D interactions in VR and AR content.

• Interhaptics Engine: a dedicated real-time engine to render haptics and 3D interactions.

1. Features

No Code: Interhaptics uses a simple drag-and-drop workflow to compose and assemble 3D interactions and haptic feedback.

I/O Hardware Independence: Interhaptics guarantees that VR and AR content remains hardware-independent across all controllers, hand-tracking systems, and haptic devices.

Development Platform Independence: Interhaptics is integrated with the Unity real-time development platform and is compatible with any other real-time 3D engine.

Streamlined Development: We created a fully customizable engine to develop 3D interactions and haptic feedback for VR and AR applications in a few clicks.

3D Interactions & Haptics Independence: Create a broad spectrum of VR and AR experiences. Compose and develop 3D interactions and haptic feedback independently with Interhaptics!

Modular Dev Tools: Interhaptics is composed of modular building blocks: the INTERACTION BUILDER for 3D interactions and the HAPTICS COMPOSER for haptic sensations, both powered by the INTERHAPTICS ENGINE to render 3D interactions and haptic feedback.

Whether you develop as a hobbyist, as a full-time professional, as a small business, or in a growing corporation, Interhaptics gives your scenario realistic interactions and haptic feedback that increase user immersion.

2. Integration

Want to use Interhaptics in your game engine?

Import the package to your project and you are ready to go.

Want to test your project with other devices?

You can switch between your gear in just two clicks.

Have you created the perfect haptic material for your experience?

Just drag and drop it into your project.

3. Examples

Case 1: Free object 

The object will follow the translation and rotation of your hand when you grasp it. But to release it, you need to place the object inside a specific area.
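As a rough sketch of this behavior (not the Interhaptics API; all names here are illustrative), the grasp-and-release logic could look like:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple  # (x, y, z) world position
    rotation: tuple  # (x, y, z) Euler angles, in degrees


def update_grasped_object(hand: Pose) -> Pose:
    # While grasped, the object simply copies the hand's translation and rotation.
    return Pose(hand.position, hand.rotation)


def can_release(obj: Pose, zone_center: tuple, zone_radius: float) -> bool:
    # Release succeeds only if the object sits inside a spherical release area.
    dist_sq = sum((a - b) ** 2 for a, b in zip(obj.position, zone_center))
    return dist_sq <= zone_radius ** 2
```

In a real engine the release zone would typically be a trigger volume rather than an explicit distance check, but the principle is the same.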


Case 2: Double grasping 

As before, the object will follow the translation and rotation of your hands. The difference here is that the object must be grabbed with both hands to trigger the interaction.


Case 3: Free position with fixed rotation

This object will follow the grabbing hand only in translation.  


Case 4: Free position with Y rotation 

In this case, the object will follow the grabbing hand in translation and, for rotation, only around its local Y axis. When the object is released, it teleports back to its original position.
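The Y-axis-only follow and the teleport-on-release behavior can be sketched as follows (a minimal illustration, not the Interhaptics API; the Euler-angle convention is an assumption):

```python
# Assumed Euler order: (pitch, yaw, roll), with yaw being rotation about local Y.

HOME = ((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))  # the object's original pose


def update_object(hand_position, hand_euler):
    """Follow the hand in translation, but keep only its yaw (local Y) rotation."""
    pitch, yaw, roll = hand_euler
    return hand_position, (0.0, yaw, 0.0)


def on_release():
    """On release, the object teleports back to its original pose."""
    return HOME
```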


Case 5: Slider 

The slider will follow the hand within an interval of values. It can be released only within a subinterval of that range.
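In other words, the slider value is the hand's position clamped to the travel interval, and release is gated by a narrower subinterval. A minimal sketch (the interval bounds are illustrative assumptions):

```python
def slider_value(hand_t, lo=0.0, hi=1.0):
    # The slider tracks the hand but is clamped to its travel interval [lo, hi].
    return max(lo, min(hi, hand_t))


def can_release(value, rel_lo=0.8, rel_hi=1.0):
    # Release is allowed only inside a subinterval of the slider's range.
    return rel_lo <= value <= rel_hi
```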


Case 6: Handler 

The interaction is triggered by a simple touch on the extremity of the object. The handler then rotates around the circle's center, following the touching finger.
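Tracking the finger around a center point reduces to computing the angle of the finger relative to that center. A sketch, assuming the rotation happens in the XY plane (not the Interhaptics API):

```python
import math


def handler_angle(finger_pos, center):
    # The handler's rotation is the angle of the touching finger around the
    # circle's center, measured in the circle's plane (here, the XY plane).
    dx = finger_pos[0] - center[0]
    dy = finger_pos[1] - center[1]
    return math.degrees(math.atan2(dy, dx))
```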


Case 7: Lever 

The lever rotates around a pivot. It cannot be released except by reaching one of its constraints.


Case 8: Local button 

This object is a sliding button computed in a local space. This way, it keeps working even if you move its base in translation or in rotation.
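The key idea is that the finger's world position is expressed in the base's local frame before the button's slide is computed, so the base can move freely. A sketch for the planar case, assuming the base translates and rotates about Y (illustrative, not the Interhaptics API):

```python
import math


def world_to_local(point, base_pos, base_yaw_deg):
    # Express a world-space point in the base's local frame. The button's
    # sliding axis lives in this local frame, so the computation is
    # unaffected by the base's own translation and yaw rotation.
    yaw = math.radians(base_yaw_deg)
    dx = point[0] - base_pos[0]
    dz = point[2] - base_pos[2]
    # Inverse-rotate the offset by the base's yaw.
    lx = math.cos(yaw) * dx + math.sin(yaw) * dz
    lz = -math.sin(yaw) * dx + math.cos(yaw) * dz
    return (lx, point[1] - base_pos[1], lz)
```

In Unity this is what `Transform.InverseTransformPoint` does for the general 3D case.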
