Interaction Builder

This low-code plugin for 3D engines lets you develop realistic hand interactions in three clicks. It increases the value of your virtual creations while significantly decreasing development time. Best used for hand tracking and UX.

Setup and Installation

Follow this setup tutorial to install the Interaction Builder correctly.


Supported Game Engines
  • Unity 2019.4 through 2020.2

Supported OS
  • Windows 10 version 1903 or later

Planned Compatibilities
  • Unreal Engine


Features

  • Game Engine plugin based on the Interhaptics Engine to easily build complex interactions with haptic feedback.
  • Low/no coding required.
  • The Interaction Builder provides scripts that you can apply to your objects to build a complex interaction system without writing a single line of code.
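The usual workflow is to drag the scripts onto your objects in the Inspector; for completeness, the same result can be sketched from code. The component name comes from this page, but the runtime setup below is a hypothetical illustration, not the documented workflow:

```csharp
using UnityEngine;

// Hypothetical script-based setup: attaching the interaction component at
// runtime instead of dragging it onto the object in the Inspector.
public class MakeInteractable : MonoBehaviour
{
    private void Awake()
    {
        // Turn this GameObject into an interactable object.
        var interactable = gameObject.AddComponent<HapticInteractionObject>();
        // The Interaction Primitive asset is then assigned in the Inspector
        // (or from code, depending on how the component exposes it).
    }
}
```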


HapticInteractionObject Script

The HapticInteractionObject component is the script that marks an object as interactable. In this script, you can manage events linked to interactions, constraints, or other triggers.

The first (and most important) part of an object's setup is the interaction settings.

The Interaction Primitive required by an object holds the configuration of your interaction.

Interaction Settings

  • Interaction Primitive: The configuration file for the global behavior of your interaction. Cf. InteractionPrimitive.
  • Block object on Interaction Start: Block the object when the interaction starts.
  • Block body part on Interaction Start: Block the body part interacting with the object.
  • Set As Kinematic during interaction: Set the object's rigidbody as kinematic during the interaction; it goes back to its previous state when the interaction finishes.
  • Compute on local basis: Compute the interaction locally if the parent of the object is also moving in space (be careful with distances and scales if you enable this option).

  • Interaction Events: Trigger events in the virtual world when the user starts or finishes an interaction with the object.
  • Constraints Events: Some object behaviors (cf. InteractionPrimitive script) have a maximum and/or minimum constraint on their movement in space. You can trigger events if one of the constraints is reached or if the object is moving between them.
  • Numerical Target: Following the same principle as the constraints, you can trigger events based on a specific value that has to be reached by the object (e.g. the slider of a radio that switches channels depending on the currently selected frequency).
  • Physical Target: Because your object moves in space, you can also trigger events if it collides with another specified object in the scene.
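In Unity, event slots like these are typically wired to public methods on a MonoBehaviour through the Inspector. A minimal sketch, assuming the events are exposed as standard Inspector event slots (the handler class and its method names below are hypothetical, not part of the plugin):

```csharp
using UnityEngine;

// Hypothetical receiver for the Interaction Events configured on a
// HapticInteractionObject. Wire these methods to the interaction start
// and finish slots in the Inspector; no other code is required.
public class DoorSoundHandler : MonoBehaviour
{
    [SerializeField] private AudioSource creakSound; // assumed audio source

    // Called when the user starts interacting with the door.
    public void OnInteractionStart()
    {
        creakSound.Play();
    }

    // Called when the interaction finishes (hand released).
    public void OnInteractionFinish()
    {
        creakSound.Stop();
    }
}
```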

InteractionPrimitive Script

To configure the previous script, you need an InteractionPrimitive. This object groups all the parameters and data required for the behavior of an interaction. You can design one Interaction Primitive and apply it to multiple objects so they share the same behavior (e.g. doors, levers, etc.).

In the Interaction Settings part, you configure the trigger event used to start the interaction, the body part to use (a specific one like your right index finger, or a generic one like any finger or either hand), and the interaction strategy, which defines how your object will move in space during the interaction (some strategies need specific settings, so don't forget to fill them in).

Interaction Strategy: this is the main parameter for how the interaction should be rendered. It will define the global shape of your behavior, on which you can apply other constraints to fit your needs, from the following list:

  • Free: The object follows the hand and its rotation. Example: common objects like tools.
  • Free with fixed rotation: The object follows the hand but its rotation is fixed. Example: objects you want to always face the same direction, such as a shield, a round table you want to move in a room, or a keyboard on a desk.
  • Free with Y rotation: The object follows the hand but only rotates on its Y-axis. Example: a vertical staff/bar, or a screwdriver for a specific use case.
  • Rotation around pivot: The object rotates around a given pivot in space and always faces the pivot. Can have constraints. Example: door, crank, wheel, valve.
  • Sliding: The object moves linearly along a specified axis. Can have constraints. Example: drawers, sliding doors/buttons, a fixed circular saw on a workbench.
  • Button: Similar to the sliding interaction. Can have constraints. Example: any button.

Interaction Trigger: Decide which gesture is required to trigger the beginning of the Interaction. The gesture recognition is computed by the Interhaptics Engine.

  • Contact: The body part only needs to be in contact with the object to trigger the interaction.
  • Grasp: The hand must be in a grasp position to trigger the interaction with the object.
  • Pinch: The hand must be in a pinch position to trigger the interaction with the object.
  • Pinch or Grasp: The hand must be in either a pinch or a grasp position to trigger the interaction with the object.

Body Part: set which body part must be in contact with the object in order to interact with it. The Interhaptics Engine is based on real bones, and the Interaction Builder uses this to support multiple body-part conditions. You must apply the HapticInteractionBodyPart script to an object in the scene and set which body part it corresponds to:

  • Hand: Any hand is required to trigger the interaction.
  • Left Hand: The left hand is required.
  • Right Hand: The right hand is required.
  • Two Hands: Both hands are required to trigger the interaction, and both must perform the same gesture.
  • One Finger: Any finger is required to trigger the interaction.

One-hand switch: Only available with the "two hands" or the "two hands and heads" interactions.

If true, the player can trigger the interaction with one hand or with two hands, depending on how many hands are interacting with the object. Otherwise, the player needs both hands to trigger the interaction.

Constraints settings

If your strategy enables it, you can add constraints to the movement of the object.

For sliders and buttons, the float value corresponds to a world distance from the origin world position of the object.

For rotations around a pivot, the float value corresponds to a world rotation in degrees. Rotations above 360° are supported (e.g. if you want a wheel to be stuck after three turns, set a maximum constraint of 1080°; the wheel will then not be able to turn more than 1080° around its pivot).

Haptic Interaction Settings

One of the specific features of the Interaction Builder is to work in synergy with the Interhaptics Engine to render haptic feedback based on the interaction you are performing.

Example of integrating a haptic material into an interaction.

The Interaction Builder uses the same haptic materials you can create with the Haptics Composer from the Interhaptics Suite. Drag and drop a haptic material into your primitive and adjust the parameters to fit your needs. This means you can create a haptic material tailored specifically to an interaction for the best possible immersion.

Use Custom Haptic Material During Interactions: because an object can already have a haptic material applied to it via the Interhaptics Engine script, you can either keep the same material for your interaction or override it with a different one.

Haptic Interaction Feedback: the haptic material that you want to use.

Rendering: check the perceptions you want to render from the material. You can filter which one you want to keep or remove.

Stiffness evaluation mode: for the stiffness, you can either:

  • Render a constant value for the entire interaction. You don’t need any material linked to the primitive.
  • Follow the stiffness of the material based on movement. It requires a strategy that enables constraints.

Texture: like the stiffness, the texture is computed based on movements.

Vibration Evaluation Mode: you can choose to start the computation of the Vibration from the object creation or from the beginning of the interaction (e.g. a vibrating car will be vibrating even if you don’t touch it, but other objects can have a short vibration only when you grab them/touch them).

Haptic Metric: defines the scale used to read the material (it can differ from the scale of the material itself). It is used to compute the haptic feedback depending on the position.

Degree Value Corresponding: translates a distance into degrees to compute the haptic feedback for the rotation-around-pivot strategy. One unit of the haptic metric is transformed into the set value.
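As an illustration of this mapping, here is a small sketch of how degrees travelled around a pivot convert back into haptic-metric units. The parameter names mirror this page, but the helper itself is hypothetical and not part of the plugin API:

```csharp
// Hypothetical helper showing how "Degree Value Corresponding" maps the
// rotation performed by the object back into haptic-metric units, which
// are then used to read the haptic material along the movement.
public static class HapticMetricSketch
{
    // degreeValueCorresponding: degrees represented by 1 haptic-metric unit.
    public static float DegreesToMetricUnits(float degreesTravelled,
                                             float degreeValueCorresponding)
    {
        return degreesTravelled / degreeValueCorresponding;
    }
}
// Example: with Degree Value Corresponding = 90, a 1080-degree wheel turn
// corresponds to 12 units of the haptic metric.
```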

HapticInteractionBodyPart Script

To trigger interactions, you need to link game objects to specific body parts.

Interaction Configuration

Body part: Set the specific body part you want this game object to represent. This is interfaced with the InteractionPrimitive to know whether an interaction should be triggered.

The body parts you can assign are:

  • Left hand / Right hand
  • All left-hand fingers individually
  • All right-hand fingers individually
  • The head of your character

Custom Point To Follow: Use this parameter if you want the actual position of your body part to be different from the position of your game object (e.g. if the transform of your hand is on the wrist but you want the hand collider to be placed in the palm).
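One common way to set this up is an empty child transform placed at the palm, which you then assign to the Custom Point To Follow field. A minimal sketch (the object name and offset below are assumptions for illustration):

```csharp
using UnityEngine;

// Hypothetical setup for Custom Point To Follow: creates an empty child
// positioned in the palm, so the interaction point differs from the
// wrist-based transform of the hand.
public class PalmPointSetup : MonoBehaviour
{
    private void Awake()
    {
        var palm = new GameObject("PalmCenter").transform;
        palm.SetParent(transform, worldPositionStays: false);
        palm.localPosition = new Vector3(0f, 0f, 0.08f); // assumed palm offset
        // Assign "palm" to the Custom Point To Follow field in the Inspector.
    }
}
```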

Bodypart Physics

Specify here the radius and the offset of the sphere collider corresponding to the body part. This collider will determine collisions and therefore if an interaction should be triggered. The position is defined based on the game object position or the custom point to follow if you set one.

Editor Integration

The Editor Integration section helps you with the physics configuration by showing in the editor the space corresponding to the collider.

InteractionBuilderManager Script

The InteractionBuilderManager is required to manage all your body parts and their interactions with any HapticsInteraction object. If it is not in the scene, the interaction system will not work.


Set here the game objects corresponding to your hands in the scene.

Interaction settings

Stop Double Grasping: in a Two Hands interaction, both of your hands must satisfy the conditions to trigger the interaction (e.g. both hands in a grasp gesture). By checking this box, the interaction stops as soon as one hand leaves the condition state. If you uncheck it, the interaction continues until both hands leave the condition state.

You can replace the installed configuration with your own if you want.

Note that if the Interaction Builder cannot load any configuration file on the current computer, it will use the configuration set in this script.

Premade RIG

Interaction Builder includes a premade rig with all the necessary scripts to start building your scene. It can be found in Interhaptics > Interaction Builder > Instantiate IBRIG.

This rig contains a VR camera and two animated hands, with the HapticBodyPart and HapticInteractionBodyPart scripts on each hand and each finger. It also comes with the InteractionBuilderManager script to manage interactions, the HapticManager script to manage haptic rendering outside interactions, and the Generic Hand Tracking script to track your hand gestures depending on your input.

API Call

You can find below a short list of the public methods provided by objects from the module.


Return the configured right hand.
Return the configured left hand.
This method will block interactions on an object. It will return true if it succeeded and false otherwise.
This method will unblock interactions on an object. It will return true if it succeeded and false otherwise.
This method will try to block interactions on an object.
This method will try to unblock interactions on an object.
ForceStartInteraction(HapticInteractionObject, BodyPartInteractionStrategy)
This method unblocks the body parts and the object, then starts the interaction.
ForceFinishInteraction(HapticInteractionObject, BodyPartInteractionStrategy)
This method unblocks the body parts and the object, then finishes the interaction.

This method converts a body part into the corresponding BodyPartInteractionStrategy value.


EvaluateHapticAmplitude(SOHapticMaterial, Vector3, float, bool)
EvaluateHapticAmplitude(SOHapticMaterial, float, float, bool)
These methods return the haptic amplitude of the material linked to the primitive. The value is a normalized float computed from your distance in space, a time value, and a Boolean indicating whether to compute vibrations.



This method blocks interactions if b is true and unblocks them if it is false.

The method will return true if the value was updated and false otherwise.

Start an interaction between the object and the body part.
It returns true if the interaction was started and false otherwise.

Finish an interaction between the object and the body part.

It returns true if the interaction was finished and false otherwise.

Force the start of an interaction between the object and the wanted body part.
Force the finish of an interaction between the object and the wanted body part.
SnapTo(Vector3, Quaternion)
This method sets a specific position/rotation for your object.
This method sets a new value for the stiffness feedback of your object's primitive. Values outside the [0;1] interval are ignored.
This method starts the vibration rendering of your primitive based on its settings.
This method stops the vibration rendering of your primitive.
This method resets the vibration feedback of your object to its beginning state.

This method returns the haptic amplitude of the material linked to the primitive.

The value is a normalized float computed with the given haptic material.


This method returns the distance between the object's current position and its origin position.

For rotationAroundPivot strategies, this method returns the total rotation performed, in degrees.
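A hedged usage sketch of the SnapTo call listed above, assuming it is exposed by the object's HapticInteractionObject component (the surrounding class is hypothetical):

```csharp
using UnityEngine;

// Hypothetical example: snapping an interactable object back to a rest
// pose with the SnapTo(Vector3, Quaternion) method listed above.
public class ResetLeverPose : MonoBehaviour
{
    [SerializeField] private HapticInteractionObject lever; // assumed reference

    private Vector3 restPosition;
    private Quaternion restRotation;

    private void Start()
    {
        // Remember the starting pose of the lever.
        restPosition = lever.transform.position;
        restRotation = lever.transform.rotation;
    }

    // Call this (e.g. from an Interaction Event) to snap the lever back.
    public void ResetPose()
    {
        lever.SnapTo(restPosition, restRotation);
    }
}
```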

Object Snapper


To understand how the Object Snapper works in the scene, we focus on four scripts:

  • ASnappableActor: Abstract class representing the actor to snap to an object (e.g. a hand).
  • SnappingData: A class inheriting from Unity's ScriptableObject which contains the pose of each ASnappableActor that you set.
  • SnappingPrimitive: An anchor on which the ASnappableActor will be snapped (e.g. gun grip, gun cannon).
  • SnappingObject: Represents the bridge between the SnappingPrimitives and the ASnappableActor (e.g. a gun).


On this script, there are two Booleans:
  • Fixed ASnappableActor interaction: if false, the snapping is dynamic (for example, while grabbing a steering wheel, your hand can rotate around the torus' tube during the snapping phase).
  • Automatically find the nearest: if true and the SnappingObject GameObject contains multiple SnappingPrimitives (on it or its children), the hand automatically snaps to any SnappingPrimitive that is nearer to the ASnappableActor than the current one.

  • Primitive shape: The geometric shape that your ASnappableActor will follow.
  • Local position: The local position of the geometric shape relative to the GameObject.
  • Local rotation: The local rotation of the geometric shape relative to the GameObject.
  • SnappingData: The SnappingData asset used by this SnappingPrimitive.
  • Movement Type: The tracking movement that the ASnappableActor will follow. For example, if the SnappingObject has a sphere primitive shape and the movement type is set to Rotation, you have to rotate your controller/hand to rotate the ASnappableActor.

If you want to delete a specific pose, select one of the SnappingData assets you created earlier. A list of the saved poses will appear, with the option to delete them.

Packages Associated

The Interaction Builder is provided with the Hand tracking for VR and MR package as well as the Academics Package.

Tutorials and demos

Check out the Interaction Builder tutorials to make sure you start your project in the best conditions.
