To start using Haptic Composer, follow these steps:
1. Download and install the Haptic Composer application from this link.
2. Open the Haptic Composer app on your PC.
3. Start creating your haptic effects.
The Haptic Composer is currently supported only on Windows.
The Haptic Composer is a tool for haptic design. It provides a graphical interface for manipulating .haps files, which store haptic experiences. Because the files use a human-readable format, they can also be edited manually.
The Haptic Composer follows the familiar interface conventions of other design software, and its functionalities are described in the tutorial section of the documentation.
Audio to Haptics encoding
The Haptic Composer also includes the ability to import a WAV audio file and extract haptic properties automatically. Once the file is imported, you can select which properties to extract by clicking on the audio representation.
Notes in the Composer can be edited precisely in the note editor by double-clicking them. The editable properties include keyframes, which can be dragged and dropped on the design space; the side panel shows the values of the selected keyframes. Right-clicking on the design space adds a keyframe or transient.
The Haptic Composer also includes several pre-designed notes, called Haptic Presets, that can be assembled to quickly create haptic experiences. Presets can be edited in the note editor. To include a Haptic Preset, click on the preset category, then drag and drop the preset name onto a haptic melody.
Haptic testing is fundamental to iterating rapidly on the final gaming experience. Interhaptics provides rapid testing methods in its products; refer to this documentation.
Haptic effects are stored in .haps files, which are written in a JSON-based format. These files are edited using Haptic Composer and rendered by the Interhaptics Engine when used in an application or game. They are digital assets that describe the haptic signals to be rendered in response to specific events or interactions within the application. The .haps files are designed to be general and platform-agnostic, with the Interhaptics SDK managing the translation of the haptic experience to the chosen peripheral. These files can store multiple haptic perceptions, such as Vibration, Stiffness, and Texture, and they follow a hierarchical structure where all perceptions within the .haps file are organized in the same way.
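Since the exact .haps schema is not reproduced here, the hierarchical idea described above can be sketched as a JSON-like structure. All field names below are illustrative assumptions, not the actual Interhaptics format:

```python
import json

# Hypothetical sketch of a .haps-like structure. Field names
# ("perceptions", "melodies", "notes", ...) are assumptions made for
# illustration; the real schema is defined by Interhaptics.
haps_sketch = {
    "version": 1,
    "perceptions": [
        {
            "type": "Vibration",  # could also be "Stiffness" or "Texture"
            "melodies": [
                {
                    "notes": [
                        {
                            "kind": "effect",  # or "transient"
                            "start": 0.0,
                            "keyframes": [
                                {"position": 0.0, "amplitude": 0.2, "frequency": 80.0},
                                {"position": 0.5, "amplitude": 1.0, "frequency": 120.0},
                            ],
                        }
                    ]
                }
            ],
        }
    ],
}

# .haps files are human-readable, so the structure serializes to JSON text.
text = json.dumps(haps_sketch, indent=2)
```

The point of the sketch is the hierarchy: each perception contains melodies, each melody contains notes, and each note is defined by keyframes.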
Keyframes are a fundamental component of haptic effects that provide precise control over the haptic experience. Each keyframe includes the following information:
The position of the rendering space where the haptic property is to be modulated
The haptic property that will be modulated at that position
Currently, the haptic properties that can be modified in a .haps file are the amplitude and frequency of the haptic signal. These characteristics are interpreted differently based on the considered perception and frequency band. It’s important to note that the position and haptic properties defined in keyframes are used by the Interhaptics Engine to render the haptic experience in the application.
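One way to picture how keyframes drive rendering is interpolation between them. The sketch below linearly interpolates an amplitude value at an arbitrary position; this is an illustration of the keyframe concept, not the Interhaptics Engine's actual algorithm:

```python
# Linear interpolation of a haptic property (e.g. amplitude) between
# keyframes. Illustrative only; the real engine may use a different
# interpolation scheme.
def interpolate(keyframes, position):
    """keyframes: list of (position, value) pairs sorted by position."""
    if position <= keyframes[0][0]:
        return keyframes[0][1]
    if position >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (p0, v0), (p1, v1) in zip(keyframes, keyframes[1:]):
        if p0 <= position <= p1:
            t = (position - p0) / (p1 - p0)  # fraction of the segment
            return v0 + t * (v1 - v0)

# Amplitude ramps up to full strength, then decays.
amplitude_keys = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.2)]
mid_ramp = interpolate(amplitude_keys, 0.25)  # halfway up the attack ramp
```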
A Haptic Note is a grouping of keyframes that provides a convenient way to organize and manage haptic effects on the timeline. It is a self-contained haptic effect that can be easily identified and manipulated. The blocks that you drag onto the timeline in the Haptic Composer are Haptic Notes.
Currently, two types of Haptic Notes are supported: Transients and Haptic Effects. Transients are short-lived haptic effects, such as shocks, and are represented in yellow in the Haptic Composer. Haptic Effects, on the other hand, are haptic patterns represented in blue. Haptic Notes can be combined to create complex haptic experiences from multiple haptic effects and properties.
Haptic melodies contain a list of haptic effects on the timeline. Haptic effects within the same melody cannot overlap.
Each haptic melody is mixed into the final output with its own properties. If two notes define haptics at the same position on two different melodies, the haptic output delivers both results simultaneously. For example, a haptic effect with a high-frequency vibration will be mixed with a second with a low-frequency vibration, and the final result will include both vibration patterns.
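The mixing behavior can be sketched as summing each melody's contribution at the same instant. The sine model and clipping below are illustrative assumptions, not the Interhaptics Engine's actual mixing strategy:

```python
import math

def melody_sample(freq_hz, amp, t):
    """One melody's vibration contribution at time t (seconds).
    A sine is an illustrative stand-in for a wideband signal."""
    return amp * math.sin(2 * math.pi * freq_hz * t)

def mix(samples):
    """Sum all melody contributions, clipped to a [-1, 1] actuator range."""
    return max(-1.0, min(1.0, sum(samples)))

# Two melodies defining haptics at the same timestamp: both are heard.
t = 0.0125
low = melody_sample(40.0, 0.5, t)    # low-frequency melody
high = melody_sample(250.0, 0.5, t)  # high-frequency melody
out = mix([low, high])
```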
Not all devices are capable of delivering the wideband haptic effects represented in melodies; refer to the device integration documentation to discover the capabilities of each device.
Perceptions are the different types of haptic effects that can be created and rendered using Haptic Composer. The currently supported perceptions are Vibrations, Stiffness, and Texture:
Vibration: Vibration haptics are vibrotactile events that occur over time. .haps files store a wideband representation of haptic vibrations, which can be created using transients, effects, and multiple melodies.
Stiffness: Stiffness is a force that occurs in relation to displacement. .haps files store a normalized representation of the stiffness value, which can be adapted to the displacement of the device and the strength capability of the device. Stiffness can also be used to design adaptive trigger feedback for controllers, like the PlayStation®5 DualSense™ wireless controller. The application integration will determine the rate of execution of the haptic feedback.
Texture: Texture stores the representation of vibrotactile haptic feedback in relation to space. Textures are rendered based on the displacement of the haptic device over a haptic-enabled surface in the application. A texture can be stored as haptics over a normalized length or in relation to a physical length. The application integration determines the rate of rendering of the texture. An example of texture is a car driving over terrain: the rendering rate of the haptic pattern depends on the speed of the car.
It is important to note that the haptic perceptions are stored in the .haps files, and the Interhaptics SDK manages the translation of the haptic experience to the chosen peripheral.
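The car-over-terrain example above can be made concrete: a texture stored over a physical length turns into a time-domain repetition rate that scales with traversal speed. The function name and formula are illustrative assumptions, not an Interhaptics API:

```python
# Sketch: a spatial texture pattern stored over a physical length is
# rendered at a rate proportional to how fast the surface is traversed.
def playback_rate(pattern_length_m, speed_m_s):
    """Repetitions of the stored texture pattern per second."""
    return speed_m_s / pattern_length_m

# The same 0.25 m bump pattern repeats more often at higher speed,
# so the texture "plays faster" as the car accelerates.
rate_slow = playback_rate(0.25, 2.5)   # driving slowly
rate_fast = playback_rate(0.25, 10.0)  # driving fast
```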
Audio To Haptics Settings
To design haptics from an audio track, import the audio file (WAV format) and convert it to a melody. Haptic Composer lets you fine-tune the filter that extracts the haptic effect. The Audio-to-Haptics settings are as follows:
Tunes the trigger amplitude for the extraction of transients
Controls the density of amplitude keyframes
Controls the density of frequency keyframes
Adjusts the passband filter that extracts the haptic characteristics
We usually suggest starting with one device in mind for the haptic experience and tweaking the frequency range toward it. For example:
iPhone: 65-300 Hz
DualSense™: 20-500 Hz
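Tweaking the frequency range toward a target device amounts to keeping designed frequencies inside that device's band. The clamping helper below is an illustrative design aid using the ranges listed above, not an Interhaptics API:

```python
# Device frequency bands taken from the examples in this section.
DEVICE_BANDS = {"iPhone": (65.0, 300.0), "DualSense": (20.0, 500.0)}

def clamp_frequency(freq_hz, device):
    """Constrain a designed keyframe frequency to the device's band."""
    lo, hi = DEVICE_BANDS[device]
    return max(lo, min(hi, freq_hz))

# 40 Hz sits below the iPhone band but inside the DualSense band.
on_iphone = clamp_frequency(40.0, "iPhone")
on_dualsense = clamp_frequency(40.0, "DualSense")
```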
You can test how the experience plays on the different supported devices in the Haptic Composer by selecting another device from the drop-down menu in the Player Settings panel. Testing on other peripherals is currently supported at the game engine level. We suggest designing with the DualSense™ in Haptic Composer; designs usually translate well to other platforms thanks to the Interhaptics transcoding layer.