CREATE COMPLEX INTERACTIONS FOR ALL XR APPLICATIONS
This low-code plugin for 3D engines allows you to develop realistic hand interactions within 3 clicks. It increases the value of your virtual creations while significantly reducing development time. Best used for hand tracking and UX.
Setup and Installation
Follow this setup tutorial to install the Interaction Builder correctly.
- Game Engine plugin based on the Interhaptics Engine to easily build complex interactions with haptic feedback.
- Low/No coding required.
- The Interaction Builder provides scripts that you can apply to your objects to build a complex interaction system without writing a single line of code.
The HapticInteractionObject component is the script that defines that an object will be interactable. In this script, you can manage events linked to interactions, constraints, or other triggers.
The first (and most important) part of an object's setup is the interaction settings.
The Interaction Primitive assigned to an object holds the configuration of your interaction.
| Setting | Description |
| --- | --- |
| Interaction Primitive | The configuration file for the global behavior of your interaction. Cf. InteractionPrimitive. |
| Block object on Interaction Start | Blocks the object when the interaction starts. |
| Block body part on Interaction Start | Blocks the body part interacting with the object. |
| Set As Kinematic during interaction | Sets the rigid body of the object as kinematic during the interaction. It reverts to its previous state when the interaction finishes. |
| Compute on local basis | Computes the interaction locally if the parent of the object is also moving in space (be careful with distances and scales if you enable this option). |
| Interaction Events | Triggers events in the virtual world when the user starts or finishes an interaction with the object. |
| Constraints Events | Some object behaviors (cf. InteractionPrimitive) have a maximum and/or minimum constraint on their movement in space. Here you can trigger events when one of the constraints is reached or while the object is moving between them. |
| Numerical Target | Following the same principle as the constraints, you can trigger events based on a specific value that has to be reached by the object (e.g. the slider of a radio that switches channels depending on the currently selected frequency). |
| Physical Target | Because your object is moving in space, you can also trigger events when your object collides with another specified object in the scene. |
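Assuming the interaction events are exposed as standard UnityEvents (as is common for Unity plugins), a minimal receiver script could look like the sketch below; the class, field, and method names are hypothetical, and the methods would be wired into the events through the Inspector:

```csharp
using UnityEngine;

// Hypothetical event receiver. Wire OnDoorGrabbed/OnDoorReleased into the
// Interaction Events of a HapticInteractionObject through the Inspector,
// as with any UnityEvent.
public class DoorSoundPlayer : MonoBehaviour
{
    [SerializeField] private AudioSource creak; // assumed AudioSource on the door

    public void OnDoorGrabbed()
    {
        creak.Play(); // play when the interaction starts
    }

    public void OnDoorReleased()
    {
        creak.Stop(); // stop when the interaction finishes
    }
}
```

The same pattern applies to Constraints Events, Numerical Target, and Physical Target callbacks: expose a public method and bind it in the Inspector.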
To configure the previous script, you need to use an InteractionPrimitive. This object groups all the parameters and data required for the behavior of an interaction. You can design one Interaction Primitive and apply it to multiple objects so that they share the same behavior (e.g. doors, levers, etc.).
In the Interaction Settings part, you configure the trigger event used to start the interaction, the body part to use (a specific one such as your right index finger, or a generic one such as any finger or either hand), and the interaction strategy, which defines how your object moves in space during the interaction (some strategies need specific settings, so don't forget to fill them in).
Interaction Strategy: this is the main parameter controlling how the interaction is rendered. It defines the global shape of your behavior, on which you can apply further constraints to fit your needs. Choose from the following list:
| Strategy | Description | Example | Can have constraints |
| --- | --- | --- | --- |
| Free | The object follows the hand and its rotation. | Common objects like tools | No |
| Free with fixed rotation | The object follows the hand but its rotation is fixed. | Objects that should always face the same direction, such as a shield, a round table you want to move in a room, or a keyboard on a desk | No |
| Free with Y rotation | The object follows the hand but only rotates on its Y-axis. | A vertical staff/bar, a screwdriver for a specific use case | No |
| Rotation around pivot | The object rotates around a given pivot in space and always faces the pivot. | Door, crank, wheel, valve | Yes |
| Sliding | The object moves linearly along a specified axis. | Drawers, a sliding door/button, a fixed circular saw on a workbench | Yes |
| Button | Similar to the Sliding strategy. | Any button | Yes |
Interaction Trigger: Decide which gesture is required to trigger the beginning of the Interaction. The gesture recognition is computed by the Interhaptics Engine.
| Trigger | Description |
| --- | --- |
| Contact | The body part only needs to be in contact with the object to trigger the interaction. |
| Grasp | The hand must be in a grasp position to trigger the interaction with the object. |
| Pinch | The hand must be in a pinch position to trigger the interaction with the object. |
| Pinch or Grasp | The hand must be in either a pinch or a grasp position to trigger the interaction with the object. |
Body Part: set which body part must be in contact with the object to interact with it. The Interhaptics Engine is based on real bone structure, and the Interaction Builder uses this to support multiple body-part conditions. Apply the HapticInteractionBodyPart script to an object in the scene and set which body part it corresponds to:
| Body part | Description |
| --- | --- |
| Hand | Any hand can trigger the interaction. |
| Left Hand | The left hand is required. |
| Right Hand | The right hand is required. |
| Two Hands | Both hands are required to trigger the interaction, and both must perform the same gesture. |
| One Finger | Any finger can trigger the interaction. |
If your strategy allows it, you can add constraints to the movement of the object.
For sliders and buttons, the float value corresponds to a world distance from the origin world position of the object.
For rotations around a pivot, the float value corresponds to a world rotation in degrees. It handles rotations above 360° (e.g. if you want a wheel to be stuck after 3 rotations, you can put a 1080° maximal constraint, and then the wheel will not be able to turn more than 1080° around its pivot).
Haptic Interaction Settings
One of the specific features of the Interaction Builder is to work in synergy with the Interhaptics Engine to render haptic feedback based on the interaction you are performing.
Example to integrate a haptic material into an interaction
It uses the same haptic materials that you can create with the Haptics Composer from the Interhaptics Suite. Drag and drop a haptic material into your primitive and adjust the parameters to fit your needs. This means you can create a haptic material tailored specifically for the interaction, for the best possible immersion.
Use Custom Haptic Material During Interactions: because an object can already have a haptic material applied to it through the Interhaptics Engine script, you can either keep the same material for your interaction or override it with a different one.
Haptic Interaction Feedback: the haptic material that you want to use.
Rendering: check the perceptions you want to render from the material. You can filter which one you want to keep or remove.
Stiffness Evaluation Mode: for the stiffness, you can choose to:
- Render a constant value for the entire interaction; no material needs to be linked to the primitive.
- Follow the stiffness of the material based on movement; this requires a strategy that enables constraints.
Texture: like the stiffness, the texture is computed based on movements.
Vibration Evaluation Mode: you can choose to start the computation of the Vibration from the object creation or from the beginning of the interaction (e.g. a vibrating car will be vibrating even if you don’t touch it, but other objects can have a short vibration only when you grab them/touch them).
Haptic Metric: defines the scale used to read the material (which can differ from the scale of the material itself). It is used to compute the haptic feedback depending on the position.
Degree Value Corresponding: translates a distance into degrees to compute the haptic feedback for the Rotation around pivot strategy. It maps 1 unit of the haptic metric to the set value.
To trigger interactions, you need to link game objects to specific body parts.
Body part: set the specific body part you want this game object to represent. This will be interfaced with the InteractionPrimitive to know whether an interaction should be triggered.
The body parts you can assign are:
- Left hand / Right hand
- All left-hand fingers individually
- All right-hand fingers individually
- The head of your character
Custom Point To Follow: Use this parameter if you want the actual position of your body part to be different from the position of your game object (e.g. if the transform of your hand is on the wrist but you want the hand collider to be placed in the palm).
Specify here the radius and the offset of the sphere collider corresponding to the body part. This collider will determine collisions and therefore if an interaction should be triggered. The position is defined based on the game object position or the custom point to follow if you set one.
The Editor Integration section helps you with the physics configuration by showing, in the editor, the space corresponding to the collider.
The InteractionBuilderManager is required to manage all your body parts and their interactions with any HapticsInteraction object. If it is not in the scene, the interaction system will not work.
Set here the game objects corresponding to your hands in the scene.
Stop Double Grasping: when you are in a Two Hands interaction, both of your hands have to check the conditions to trigger the interaction (e.g. both hands in grasp gesture). By checking this box, you can decide that the interaction is stopped if one hand leaves the condition state. If you uncheck it, the interaction will continue until both hands leave the condition state.
You can replace the installed configuration with your own if you want.
Note that if the Interaction Builder cannot load any configuration file on the current computer, it falls back to the configuration set in this script.
Interaction Builder includes a premade rig with all the necessary scripts to start building your scene. It can be found in Interhaptics > Interaction Builder > Instantiate IBRIG.
This rig contains a VR camera, two animated hands with HapticsBodyPart, and HapticInteractionBodypart on each hand and each finger. It also comes with the InteractionBuilderManager script to manage interactions, HapticManager script to manage haptic rendering outside interactions, and Generic Hand Tracking script to track the gesture of your hands depending on your input.
| Method | Returns | Description |
| --- | --- | --- |
| RightHand() | HapticInteractionBodyPart | Returns the configured right hand. |
| LeftHand() | HapticInteractionBodyPart | Returns the configured left hand. |
| BlockObject(HapticInteractionObject) | bool | Blocks interactions on an object. Returns true if it succeeded, false otherwise. |
| UnlockObject(HapticInteractionObject) | bool | Unblocks interactions on an object. Returns true if it succeeded, false otherwise. |
| TryToBlockObject(HapticInteractionObject) | void | Tries to block interactions on an object. |
| TryToUnblockObject(HapticInteractionObject) | void | Tries to unblock interactions on an object. |
| ForceStartInteraction(HapticInteractionObject, BodyPartInteractionStrategy) | void | Unblocks the body parts and the object, then starts the interaction. |
| ForceFinishInteraction(HapticInteractionObject, BodyPartInteractionStrategy) | void | Unblocks the body parts and the object, then finishes the interaction. |
| EvaluateHapticAmplitude(SOHapticMaterial, Vector3, float, bool) | float | Returns the haptic amplitude of the material linked to the primitive. The value is a normalized float computed from your distance in space, a time value, and a boolean indicating whether to compute vibrations. |
| EvaluateHapticAmplitude(SOHapticMaterial, float, float, bool) | float | Overload of the above taking a float distance instead of a Vector3 position. |

The manager also provides a method that converts a body part into the correct BodyPartInteractionStrategy value.
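As a sketch of how the blocking methods might be used from gameplay code, assuming a serialized reference to the manager component in the scene (the exact access pattern may differ), and with DoorLocker as a hypothetical class name:

```csharp
using UnityEngine;

// Hypothetical sketch: lock and unlock a door from gameplay code using the
// manager's blocking methods. DoorLocker is not part of the plugin.
public class DoorLocker : MonoBehaviour
{
    [SerializeField] private InteractionBuilderManager manager; // scene manager
    [SerializeField] private HapticInteractionObject door;      // interactable door

    public void Lock()
    {
        manager.TryToBlockObject(door);   // door can no longer be interacted with
    }

    public void Unlock()
    {
        manager.TryToUnblockObject(door); // interactions are allowed again
    }
}
```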
A blocking method changes the block state of interactions: it blocks them if its boolean parameter b is true and unblocks them if it is false. It returns true if the value was updated and false otherwise.
| Method | Returns | Description |
| --- | --- | --- |
| StartInteraction(BodyPartInteractionStrategy) | bool | Starts an interaction between the object and the body part. Returns true if the interaction was started, false otherwise. |
| FinishInteraction(BodyPartInteractionStrategy) | bool | Finishes an interaction between the object and the body part. Returns true if the interaction was finished, false otherwise. |
| ForceStartInteraction() | void | Forces the start of an interaction between the object and the wanted body part. |
| ForceFinishInteraction() | void | Forces the end of an interaction between the object and the wanted body part. |
| SnapTo(Vector3, Quaternion) | void | Sets a specific position and rotation for your object. |
| SetStiffness(float) | void | Sets a new value for the stiffness feedback of the primitive of your object. Values outside the [0;1] interval are ignored. |
| PlayVibration() | void | Starts the vibration rendering of your primitive based on its settings. |
| StopVibration() | void | Stops the vibration rendering of your primitive. |
| ResetVibration() | void | Resets the vibration feedback of your object to its initial state. |

The object also exposes a method that returns the haptic amplitude of the material linked to the primitive; the value is a normalized float computed with the given haptic material. Another method returns the distance between the current object position and its origin position; for Rotation around pivot strategies, it returns the total degree value performed.
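A minimal sketch of calling these object methods from your own script; Respawner, item, and spawnPoint are hypothetical names introduced for illustration:

```csharp
using UnityEngine;

// Hypothetical sketch: reset an interactable item to a spawn pose and give
// a short haptic cue. Respawner is not part of the plugin.
public class Respawner : MonoBehaviour
{
    [SerializeField] private HapticInteractionObject item;
    [SerializeField] private Transform spawnPoint;

    public void Respawn()
    {
        item.SnapTo(spawnPoint.position, spawnPoint.rotation); // reposition the object
        item.SetStiffness(0.5f); // values outside [0;1] would be ignored
        item.PlayVibration();    // vibration cue based on the primitive's settings
    }
}
```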
To understand how the Object Snapper works in the scene, we focus on four scripts:
Tutorials and demos