The Interhaptics engine renders real-time, high-fidelity haptics and 3D interactions for XR and mobile devices. At its core, it is a real-time rendering process for both 3D interactions and haptic feedback. Here are some of its main features:
Interhaptics takes inspiration from how humans perceive the sense of touch. Humans can discern roughly 20-25 distinct touch sensations, but existing haptic devices can convincingly render only 4 of them: vibrations, textures, stiffness, and temperature. Each of these perceptions rests on one solid perceptual characteristic, which we extracted and modeled:
Two of them, textures and stiffness, are perceptually encoded in space (the faster you move, the faster the sensation unfolds), and the other two, vibrations and temperature, are encoded in time. Take the example of a button: the sensation we perceive when we press it is a function of how far we press, until a click happens and the force suddenly drops:
This behavior is intrinsic to how humans perceive touch sensations. All of these algorithms and concepts are included within the Interhaptics engine and are computed in real time to give users the best possible haptic experience.
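As a rough illustration of that behavior, here is a minimal sketch with assumed values for stiffness and click depth (not the actual Interhaptics stiffness model): the resisting force grows with how far you press, then collapses at the click.

```python
# Illustrative sketch with assumed values (stiffness, click depth); this is not
# the actual Interhaptics stiffness model, just the shape of a button click.
def button_force(penetration_mm: float,
                 stiffness_n_per_mm: float = 0.8,
                 click_at_mm: float = 2.0,
                 post_click_ratio: float = 0.3) -> float:
    """Resisting force (N) as a function of how deep the button is pressed."""
    if penetration_mm <= click_at_mm:
        return stiffness_n_per_mm * penetration_mm                  # force builds up with the press
    return post_click_ratio * stiffness_n_per_mm * penetration_mm   # force drops after the click

for depth_mm in (0.5, 1.0, 1.5, 2.0, 2.5):
    print(depth_mm, round(button_force(depth_mm), 2))
```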
There is a well-known problem in the field of haptics, rooted in signal processing theory: the data acquisition rate of the system determines the accuracy of your processed signal. Position acquisition is fundamental for rendering textures and stiffness, yet XR and mobile systems acquire position at rates of only up to 120 Hz. Where is the catch? Haptics, to work properly, requires rendering frequencies in the 1 kHz region. Here we run into the well-known problem of under-sampling: you can have the best haptic actuator on the planet, but if you cannot feed it an accurate signal fast enough, it is pretty much useless.
(See https://link.springer.com/chapter/10.1007/978-3-319-42321-0_23 and https://dl.acm.org/doi/abs/10.1145/3025453.3026010 for a broader academic discussion of the problem and case-specific solutions.)
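To get a feel for the numbers, here is a back-of-the-envelope sketch using the typical figures quoted above plus an assumed hand speed; it is purely illustrative, not Interhaptics internals.

```python
# Back-of-the-envelope illustration of the sampling mismatch, using the typical
# figures quoted above and an assumed hand speed (not Interhaptics internals).
tracking_rate_hz = 120    # position acquisition rate of an XR/mobile tracker
haptic_rate_hz = 1000     # rate at which a haptic actuator needs to be driven
hand_speed_m_s = 0.5      # assumed hand sweep over a virtual texture

frames_per_sample = haptic_rate_hz / tracking_rate_hz         # ~8.3 haptic frames share one stale position
position_step_mm = hand_speed_m_s / tracking_rate_hz * 1000   # ~4.2 mm of motion between position updates

print(f"{frames_per_sample:.1f} haptic frames per tracking sample")
print(f"{position_step_mm:.1f} mm of motion between position updates")
# A fine texture (e.g. a 1 mm grating) is therefore badly aliased if the haptic
# signal is computed only at the raw tracking samples.
```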
Interhaptics solves this problem in a general manner. We included a model of human motor behavior and haptic perception within the Interhaptics engine, and we use this model to hyper-sample the user's position and deliver a stable haptic perception for both spatial and time-based haptics.
This allows Interhaptics to create a perceptually accurate haptic rendering, far beyond the position acquisition capability of the tracking system. The method works because touch perception and the limits of human motor behavior are accounted for within the rendering algorithm.
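As a rough illustration of the general idea (not the perceptual and motor model actually used inside the engine), one could up-sample the position stream by extrapolating with the last estimated velocity, clamped to a plausible human hand speed:

```python
import numpy as np

# Minimal sketch of the general idea, not the perceptual/motor model actually
# used inside the Interhaptics engine: up-sample a 120 Hz position stream to
# 1 kHz by extrapolating with the last estimated velocity, clamped to a
# plausible human hand speed so tracking glitches cannot create fake motion.
TRACKING_DT = 1.0 / 120    # seconds between position samples
HAPTIC_DT = 1.0 / 1000     # seconds between haptic rendering frames
MAX_HAND_SPEED = 2.0       # m/s, assumed upper bound on voluntary hand motion

def hyper_sample(prev_pos: np.ndarray, curr_pos: np.ndarray):
    """Yield 1 kHz position estimates until the next tracking sample arrives."""
    velocity = (curr_pos - prev_pos) / TRACKING_DT
    speed = np.linalg.norm(velocity)
    if speed > MAX_HAND_SPEED:                         # clamp implausible jumps
        velocity = velocity / speed * MAX_HAND_SPEED
    steps = int(round(TRACKING_DT / HAPTIC_DT))        # ~8 haptic frames per tracking sample
    for k in range(1, steps + 1):
        yield curr_pos + velocity * (k * HAPTIC_DT)    # predicted positions for the 1 kHz renderer
```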
One of the issues when deploying a 3D application is how to manage the different inputs and outputs across platforms. There are several kinds of inputs: hands, controllers, gamepads, buttons, or even touch screens for an AR application. Fundamentally, these interfaces do the same job during an interaction: define an input, execute a spatial transformation, and meet an ending condition.
While more numerous than 2D gestures such as drag-and-drop or pinch-and-zoom, 3D interactions can be categorized into a preset of spatial transformations useful for human-machine interaction. We categorized a large set of 3D interaction primitives covering much of the hand and finger input used in today's 3D applications, and encoded their behavior within the Interhaptics engine for a faster development process.
Strategy | Description | Use Cases |
---|---|---|
Free movement | The object moves freely in rotation and position. | Useful for most common objects that you grab or pinch in the environment (tools, components, pens, mugs, etc.). |
Rotation around pivot* | The object's movement is constrained to a rotation around an axis. | Anything like doors, levers, valves, and wheels. |
Sliding* | The object's movement is constrained to follow a specific axis. | Great value for any sliding object (doors, box caps, UI sliders, drawers, buttons...). |
Free + fixed rotation* (e.g. free + Y rotation) | Modified version of the free movement that applies constraints on specific rotation axes. | Mostly used for big objects that you want to move freely. Fixing rotations keeps large objects from spinning or rotating fast and confusing the user. |

*Additional spatial constraints can be applied to define the axis of rotation/sliding as well as rotation and sliding limits.
All strategies can be triggered by contact from the hand or fingers, by a grab, or by a pinch.
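To make the idea concrete, here is a minimal sketch of what a "rotation around pivot" constraint could look like; the function name, reference direction, and clamping strategy are illustrative assumptions, not the engine's API.

```python
import numpy as np

# Minimal sketch of the "rotation around pivot" strategy from the table above.
# Whatever raw pose the hand or controller produces, the object is only allowed
# to rotate about a fixed axis through a pivot, within angular limits.
def constrain_to_pivot(grab_point, pivot, axis, min_deg=-90.0, max_deg=90.0):
    """Return the grab point's angle (deg) around the axis, clamped to the limits."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    r = np.asarray(grab_point, dtype=float) - np.asarray(pivot, dtype=float)
    r = r - axis * np.dot(r, axis)                 # project onto the plane normal to the axis
    ref = np.cross(axis, [1.0, 0.0, 0.0])          # arbitrary reference direction in that plane
    if np.linalg.norm(ref) < 1e-6:                 # axis was parallel to x: pick another reference
        ref = np.cross(axis, [0.0, 1.0, 0.0])
    ref /= np.linalg.norm(ref)
    angle = np.degrees(np.arctan2(np.dot(np.cross(ref, r), axis), np.dot(ref, r)))
    return float(np.clip(angle, min_deg, max_deg))
```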
These interaction primitives work much like the drag-and-drop or pinch-and-zoom algorithms implemented in the touch panel of your smartphone. The Interhaptics engine allows you to carry these interaction primitives to any platform without rewriting any code.
This approach turbocharges the cross-compatibility of your interactive application across platforms, with no porting code to write!
Last but not least: how do you include haptics in your applications?
Historically, there are 2 separate ways to include haptics in applications. One is based on physics simulation of contacts and forces (touch an object, feel haptic feedback), and the other is based on feedback generated by events: click a button, feel a predetermined confirmation pattern.
Interhaptics supports both methods of development. You can set a stiffness to be rendered on contact with an object, as well as deliver a specific vibration pattern when an event happens.
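As a rough illustration, the two styles could be sketched like this; the callback names on_contact and on_event are hypothetical stand-ins for whatever hooks your framework exposes, not the Interhaptics API.

```python
# Sketch of the two historical styles; on_contact and on_event are hypothetical
# stand-ins for whatever hooks your framework exposes, not the Interhaptics API.

def on_contact(penetration_mm: float) -> float:
    """Physics-style: feedback computed from the simulated contact itself."""
    stiffness_n_per_mm = 0.8                      # assumed material stiffness
    return stiffness_n_per_mm * penetration_mm    # force to render while touching the object

def on_event(event_name: str) -> str:
    """Event-style: a predetermined pattern played when something happens."""
    patterns = {"button_click": "short_sharp_pulse",
                "notification": "double_soft_pulse"}
    return patterns.get(event_name, "default_pulse")
```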
One more thing.
Once spatial haptics and 3D interactions are accounted for in the same engine, a third method of including haptics in your application is unlocked: haptics as a function of interactions.
This third method gives application developers great freedom to design satisfying and engaging interactions and haptic experiences in a few clicks.
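As a rough illustration of the idea, the haptic output can be driven directly by the state of an interaction primitive; here, detents on a door angle coming from a rotation-around-pivot strategy (the detent spacing and function name are assumed for the example).

```python
# Illustrative sketch only: the haptic output is driven by the state of the
# interaction itself, here the angle of a door handled by a rotation-around-
# pivot strategy. The detent spacing and function name are assumed.
def door_haptics(angle_deg: float, detent_every_deg: float = 15.0) -> float:
    """Vibration amplitude (0..1), strongest each time the door crosses a detent."""
    phase = (angle_deg % detent_every_deg) / detent_every_deg
    return abs(2.0 * phase - 1.0)   # 1.0 at each detent, fading to 0 halfway between
```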
All of these concepts together make the Interhaptics engine a unique piece of technology that provides a transparent interface for both application developers and end users. You can scale the great experience you designed and reach your users on 3 billion devices.