.haps File Format
Haptic materials are stored in .haps files, a JSON-based format. These files are authored in Haptic Composer and rendered by the Interhaptics Engine when embedded in an application or game. They are digital assets that describe the haptic signals to be rendered in response to specific events or interactions within the application. The .haps format is designed to be general and platform-agnostic: the Interhaptics SDK manages the translation of the haptic experience to the chosen peripheral. A single file can store multiple haptic perceptions, such as Vibration, Stiffness, and Texture, and all perceptions within a .haps file follow the same hierarchical structure.
Keyframes are a fundamental component of haptic materials that provide precise control over the haptic experience. Each keyframe includes the following information:
- The position of the rendering space where the haptic property is to be modulated
- The haptic property that will be modulated at that position
Currently, the haptic properties that can be modified in a .haps file are the amplitude and frequency of the haptic signal. These characteristics are interpreted differently depending on the perception and frequency band being considered. The positions and haptic properties defined in keyframes are what the Interhaptics Engine uses to render the haptic experience in the application.
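The keyframe model above can be sketched in code. This is a minimal illustration, not the actual .haps JSON schema (which is not reproduced here): the field names `position`, `amplitude`, and `frequency`, and the linear interpolation between keyframes, are assumptions made for the example.

```python
from bisect import bisect_right
from dataclasses import dataclass

# Hypothetical keyframe: position in the rendering space (time or normalized
# length, depending on the perception) plus the two modifiable properties.
@dataclass
class Keyframe:
    position: float   # where in the rendering space the property is modulated
    amplitude: float  # normalized amplitude, 0.0 to 1.0
    frequency: float  # frequency in Hz, for vibrotactile perceptions

def amplitude_at(keyframes: list[Keyframe], pos: float) -> float:
    """Linearly interpolate amplitude between the surrounding keyframes."""
    keyframes = sorted(keyframes, key=lambda k: k.position)
    if pos <= keyframes[0].position:
        return keyframes[0].amplitude
    if pos >= keyframes[-1].position:
        return keyframes[-1].amplitude
    positions = [k.position for k in keyframes]
    i = bisect_right(positions, pos) - 1
    a, b = keyframes[i], keyframes[i + 1]
    t = (pos - a.position) / (b.position - a.position)
    return a.amplitude + t * (b.amplitude - a.amplitude)
```

A renderer evaluating such a curve would sample `amplitude_at` at its own update rate; how the real engine interpolates between keyframes is internal to the Interhaptics SDK.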
Haptic Notes are groupings of keyframes that provide a convenient way to organize and manage haptic effects on the timeline. A Haptic Note is a self-contained haptic effect that can be easily identified and manipulated on the timeline. The blocks that you drag onto the timeline in Haptic Composer are Haptic Notes.
Two types of Haptic Notes are currently supported: Transients and Haptic Effects. Transients are short-lived haptic effects, such as shocks, and are shown in yellow in Haptic Composer. Haptic Effects are longer haptic patterns and are shown in blue. Haptic Notes can be combined to create complex haptic experiences from multiple haptic effects and properties.
Haptic melodies contain a list of haptic effects on the timeline. Haptic effects within a single melody cannot overlap.
Haptic melodies are mixed into the final result, each with its own properties. If two notes define haptics at the same position on two different melodies, the haptic output will deliver both results at the same time. For example, a haptic effect with a high-frequency vibration on one melody will be mixed with a low-frequency vibration on another, and the final result will contain both vibration patterns.
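The mixing behavior described above can be illustrated with a simple sketch. The exact mixing policy of the Interhaptics Engine is internal; here two melodies are modeled as signal functions whose outputs are summed and clamped to the normalized range, which is an assumption made for the example.

```python
import math

# Two melodies modeled as time -> signal value. The frequencies (60 Hz and
# 250 Hz) are illustrative choices, not values taken from the .haps format.
def low_freq_melody(t: float) -> float:
    return 0.5 * math.sin(2 * math.pi * 60 * t)

def high_freq_melody(t: float) -> float:
    return 0.5 * math.sin(2 * math.pi * 250 * t)

def mix(t: float) -> float:
    """Sum the melodies and clamp to [-1, 1]; both patterns remain audible
    in the result, as when two notes overlap on different melodies."""
    return max(-1.0, min(1.0, low_freq_melody(t) + high_freq_melody(t)))
```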
Not all devices are capable of delivering the wideband haptic effects represented in melodies; refer to the device integration documentation to discover the capabilities of each device.
Perceptions are the different types of haptic effects that can be created and rendered using Haptic Composer. The currently supported perceptions are Vibrations, Stiffness, and Texture:
- Vibration: A vibrotactile event that occurs over time. .haps files store a wideband representation of haptic vibrations, which can be created using transients, effects, and multiple melodies.
- Stiffness: A force that occurs in relation to displacement. .haps files store a normalized representation of the stiffness value, which can be adapted to the displacement range and strength capability of the device. Stiffness can also be used to design adaptive trigger feedback for controllers, such as the PlayStation®5 DualSense™ wireless controller. The application integration determines the rate of execution of the haptic feedback.
- Texture: A representation of vibrotactile haptic feedback in relation to space. Textures are rendered based on the displacement of the haptic device over a haptic-enabled surface in the application. A texture can be stored as haptics over a normalized length or in relation to a physical length. The application integration determines the rate of rendering of the texture. An example of a texture is a car driving over terrain: the rate at which the haptic pattern is rendered depends on the speed of the car.
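The space-based rendering of textures can be sketched as follows. This is an illustration under assumed conventions, not the SDK's actual rendering code: the texture is modeled as a looping pattern over a normalized length, and the playback position advances with the device's displacement, so a faster-moving device (like the faster car above) replays the pattern at a higher rate.

```python
def texture_amplitude(pattern: list[float], distance_travelled: float,
                      pattern_length: float = 1.0) -> float:
    """Sample a looping spatial pattern at the current displacement.

    pattern:            amplitude samples over one pattern_length of surface
    distance_travelled: total displacement of the device over the surface
    pattern_length:     surface length covered by one repetition (normalized
                        or physical units, matching distance_travelled)
    """
    u = (distance_travelled % pattern_length) / pattern_length  # 0..1
    idx = int(u * len(pattern)) % len(pattern)
    return pattern[idx]
```

Because sampling is driven by displacement rather than time, the same stored pattern feels denser or sparser depending on how fast the device moves, which matches the car-over-terrain example.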
All of these haptic perceptions are stored in the .haps file, and the Interhaptics SDK manages the translation of the haptic experience to the chosen peripheral.