The market for VR creation tools is constantly expanding with more and more valuable offers, and it can be difficult to evaluate different products and find the one that best suits your project. To illustrate this point, we will compare two similar-looking products that you can use to create interactions for VR content: the VR Interaction Framework, a well-reviewed Unity plugin known for being user friendly, and the Interaction Builder from the Interhaptics Suite.
The two are similar in many respects, such as framework complexity, coding complexity, and the value they aim to provide to developers. Both are built on three pillars: define an interactive/grabbable object, define the body part/grabber, and define the events attached to each interaction. Whether you use the Interaction Builder or the VR Interaction Framework, your coding knowledge will not be put under pressure to understand and create interactions. However, you always have API/framework access if you need to go more in-depth during development.
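To make the three pillars concrete, here is a minimal sketch of that pattern in Python. Every name in it (`Grabbable`, `Grabber`, the event lists) is an illustrative assumption for this article, not the actual API of either product:

```python
# Hypothetical sketch of the three pillars shared by both tools:
# (1) a grabbable object, (2) a grabber/body part, (3) per-interaction events.
# Class and method names are illustrative, not either product's real API.
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class Grabbable:
    """Pillar 1: an object that can be interacted with."""
    name: str
    on_grab: List[Callable[["Grabber"], None]] = field(default_factory=list)
    on_release: List[Callable[["Grabber"], None]] = field(default_factory=list)

@dataclass
class Grabber:
    """Pillar 2: the body part (hand or controller) that performs the grab."""
    name: str
    held: Optional[Grabbable] = None

    def grab(self, obj: Grabbable) -> None:
        self.held = obj
        for handler in obj.on_grab:      # Pillar 3: fire interaction events
            handler(self)

    def release(self) -> None:
        if self.held is not None:
            for handler in self.held.on_release:
                handler(self)
            self.held = None

# Usage: register an event, then grab the object with a hand.
cube = Grabbable("cube")
cube.on_grab.append(lambda g: print(f"{g.name} grabbed {cube.name}"))
hand = Grabber("right_hand")
hand.grab(cube)   # prints "right_hand grabbed cube"
```

Both products expose some variant of this object/grabber/event triad; they differ mainly in how the grab itself is detected and how the held object then moves, as described below.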
Of course, it would be misleading to say they are identical. Some features look similar but are not built the same way or for the same purpose; others are completely different.
Starting with the VR Interaction Framework, you will find an easy-to-use grabber/grabbable system. It is not optimized for precise hand interactions but works well for any interaction based on grabbing. Also, since it is not focused on hands, there is a dissociation between "what is a hand" and "what can grab", which gives some flexibility. The VR Interaction Framework mainly uses button inputs to trigger interactions and the events fired during them. It is also compatible with Oculus Quest hand tracking (specifically, the pinch gesture is used to simulate the grab). The trade-off is a loss of consistency between hand-tracking input and controller input. The interactions are half logic-based and half physics-based: they are triggered logically (by a specific input/value), but the behavior of the object is managed by the physics engine, which requires some extra preparation on the object itself with physics components such as joints.
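The "half logic, half physics" split can be sketched as follows. This is a hedged illustration of the general pattern the paragraph describes, not the framework's code: the grip threshold, function names, and the pinch mapping are all assumptions made for the example.

```python
# Hedged sketch of a button-driven grab in the style described above:
# the *trigger* is pure logic (an input value crossing a threshold), while
# the held object's motion would be left to the physics engine (stubbed out).
# GRIP_THRESHOLD and all names are illustrative assumptions.

GRIP_THRESHOLD = 0.75  # how far the grip button must be pressed to grab

def update_grab(grip_value: float, holding: bool) -> bool:
    """Logic half: decide the grab state purely from the button input."""
    if not holding and grip_value >= GRIP_THRESHOLD:
        return True           # input crossed the threshold: start grabbing
    if holding and grip_value < GRIP_THRESHOLD:
        return False          # input dropped below the threshold: release
    return holding            # no change this frame

def pinch_to_grip(pinch_strength: float) -> float:
    """Hand tracking reuses the same path: pinch strength (0..1) is
    treated as a pseudo grip-button value."""
    return pinch_strength

holding = False
for grip in (0.1, 0.8, 0.9, 0.3):     # simulated grip values, one per frame
    holding = update_grab(grip, holding)
    # Physics half (not shown): while holding, the object would be driven
    # by joints/rigidbody components, not by this logic.
print(holding)  # False: the grip dropped below the threshold on the last frame
```

Because pinch strength and a grip button do not behave identically, funneling both through the same threshold is exactly where the consistency gap between hand tracking and controllers can appear.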
On the other hand, you will find the Interaction Builder. It focuses on hand interactions at a detailed level, with a finer definition of the palm and fingers, enabling precise interactions based on body parts. Because it is built around the hands themselves, the Interaction Builder does not use button inputs to trigger interactions or events, but a grab strength computed from the skeleton of the hand. This gives a realistic representation of a "grab" and makes interactions work consistently with both hand tracking and controllers. Everything is computed in a logic engine, so no complex physics components are required to create an interactive object (with a physics-based approach, you would usually add joints and configure them).
Each interaction is triggered logically, and the behavior of the object in 3D space is also managed logically, so you just need to apply a script and the object will behave as it should.
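The skeleton-based approach can be sketched like this. The curl representation, the averaging formula, and the threshold are assumptions made for illustration; the Interaction Builder's actual computation is not documented here:

```python
# Hedged sketch of a skeleton-based grab strength, in the spirit of what is
# described above: instead of a button press, grab strength is derived from
# the hand skeleton (modeled here as per-finger curl values in [0, 1]).
# The formula, threshold, and names are illustrative assumptions.

def grab_strength(finger_curls: list) -> float:
    """Average curl of the fingers: 0.0 = open hand, 1.0 = closed fist."""
    return sum(finger_curls) / len(finger_curls)

GRAB_THRESHOLD = 0.6  # illustrative value

def is_grabbing(finger_curls: list) -> bool:
    return grab_strength(finger_curls) >= GRAB_THRESHOLD

# With hand tracking, the curls come from the tracked skeleton; with a
# controller, the trigger value can drive a synthetic skeleton pose, so the
# same check behaves consistently for both input methods.
open_hand   = [0.1, 0.1, 0.2, 0.1, 0.1]   # thumb..pinky
closed_fist = [0.8, 0.9, 0.9, 0.9, 0.8]

print(is_grabbing(open_hand))    # False
print(is_grabbing(closed_fist))  # True
```

Driving one grab metric from the skeleton, whatever the input device, is what makes the behavior consistent across hand tracking and controllers.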
If your project does not require fully realistic interactions and takes more of a gaming approach, the VR Interaction Framework will serve you well, especially if you want to work in a fully physics-based environment. However, given its hand-centric design and its logic engine for stability, the Interaction Builder is the natural choice for any serious game, business application, or professional training (e.g. any VR content requiring realistic hand interactions, or content built around hand tracking). Also, if you do not want to learn about and spend time on the physics side of interactions, the Interaction Builder will handle everything for you in its logic engine.