Hand Tracking for XR: Natural inspired UX and input

April 8th, 2020  |  by Admin

General Handtracking

After covering haptics design and interaction building, let's talk about one of the newest and most powerful technologies for XR: hand tracking and its input mechanisms.

Hand tracking systems are bringing a new level of immersion and user experience to extended reality applications. In a few words: the user sees a button, reaches out, and clicks it with their fingertips. Everything is easy there! Great user experience, little usage friction, and excellent immersion.
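The fingertip-click idea above can be sketched in a few lines. This is a minimal illustration, not the Interhaptics API: it assumes the hand tracking runtime gives us a 3D fingertip position each frame, and the `Button` type, `radius`, and `update_button` names are all hypothetical.

```python
# Hypothetical sketch: fire a press event on the frame the fingertip
# enters a button's press zone. Names and the 2 cm radius are illustrative.
from dataclasses import dataclass

@dataclass
class Button:
    x: float
    y: float
    z: float
    radius: float = 0.02   # press zone in metres (assumed value)
    pressed: bool = False

def update_button(button: Button, tip: tuple[float, float, float]) -> bool:
    """Return True only on the frame the fingertip first enters the zone."""
    dx, dy, dz = tip[0] - button.x, tip[1] - button.y, tip[2] - button.z
    inside = (dx * dx + dy * dy + dz * dz) ** 0.5 <= button.radius
    fired = inside and not button.pressed   # rising edge only
    button.pressed = inside
    return fired

# Example: the fingertip approaches the button over three frames.
btn = Button(0.0, 0.0, 0.0)
frames = [(0.10, 0.0, 0.0), (0.01, 0.0, 0.0), (0.005, 0.0, 0.0)]
events = [update_button(btn, tip) for tip in frames]
print(events)  # the press fires once, on the second frame
```

The rising-edge check is what keeps a held fingertip from re-triggering the button every frame, which matters for exactly the low-friction feel described above.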

This feature is so compelling that almost every major headset maker is bringing these technologies to their headsets.

Figure 1: Headsets using hand tracking technology.

Hand tracking works so well because the user does not have to learn anything to interact with virtual objects. The user sees an object and reacts exactly as they would IRL (in real life). A ball on the table? Reach out and grab it!

It can be argued that the development of great intelligence in the human species is tied to the ability to create and manipulate new tools and objects. When we look in detail at how we, as humans, grasp and manipulate objects, we arrive at the map described in The GRASP Taxonomy of Human Grasp Types [1], illustrated in Figure 2.

Figure 2 : T. Feix, J. Romero, H. Schmiedmayer, A. M. Dollar and D. Kragic, The GRASP Taxonomy of Human Grasp Types, in IEEE Transactions on Human-Machine Systems, vol. 46, no. 1, pp. 66–77, Feb. 2016.

This subdivision is fundamentally split into two major categories, power and precision manipulation, depending on how we control the position of the object: with the finger pads for precision manipulation, and against the palm for power manipulation.

What this division shows is how radically differently we manipulate different objects: how the fingers are arranged, and how the pre-manipulation action is made to accommodate the final position.
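The power/precision split above suggests a simple contact-based test, sketched here under assumptions: the tracking layer tells us which parts of the hand touch the object, and the contact labels are hypothetical names of ours, not taken from the GRASP taxonomy paper.

```python
# Illustrative grasp classifier following the split described above:
# palm contact -> power grasp; finger pads only -> precision grasp.
def classify_grasp(contacts: set[str]) -> str:
    """contacts: hand parts currently touching the object,
    e.g. {"thumb_pad", "index_pad", "palm"} (hypothetical labels)."""
    if not contacts:
        return "none"
    if "palm" in contacts:
        return "power"       # object held against the palm
    return "precision"       # object controlled by the finger pads only

print(classify_grasp({"thumb_pad", "index_pad"}))          # a pinch
print(classify_grasp({"thumb_pad", "index_pad", "palm"}))  # a full grab
```

A real implementation would look at many more parameters (finger arrangement, pre-shaping of the hand), but even this coarse test separates the two columns of the taxonomy.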

Figure 3: Pinch vs. grab.

Over the last several years working on customer projects using hand tracking technologies, we quickly realized that a single pinch-and-grab input did not meet the needs of a smooth user experience, and generated a significant amount of frustration for the final user.

Pinching to move a large object is not at all expected by the user!

In some cases, we were forced to completely give up on an immersive behaviour of the environment because the input actions were not consistent, and users were fundamentally unable to use the product. That meant user expectations were inconsistent across the product: some interactions were realistic, while others had a high level of abstraction. This choice was not driven by user preference, but by an inconsistent response of the technology that did not take human factors into consideration.
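One way to avoid the "pinch a large object" mismatch is to derive the expected gesture from the object's size, so small items respond to a pinch and large items to a full grab. The sketch below is our own illustration of that idea, not the Interhaptics algorithm, and the 6 cm threshold is an arbitrary assumed value.

```python
# Hedged sketch: choose the expected grab gesture from object size.
PINCH_MAX_DIAMETER = 0.06  # metres; assumed cutoff, larger objects expect a grab

def expected_gesture(object_diameter: float) -> str:
    if object_diameter <= PINCH_MAX_DIAMETER:
        return "pinch"   # precision manipulation with the finger pads
    return "grab"        # power manipulation against the palm

print(expected_gesture(0.02))  # a marble-sized object -> pinch
print(expected_gesture(0.25))  # a ball-sized object   -> grab
```

Branching on size like this keeps the input model aligned with what the user's hand naturally does, which is exactly the consistency the paragraph above says was missing.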

Figure 4

The mission of Interhaptics is to give XR creators consistent, realistic, and immersive development tools that increase the value of the final product. For this reason, in partnership with several research experts, we developed grabbing algorithms for hand tracking technology inspired by human behaviour. The objective of these algorithms is to give the best UX possible when using hand tracking technologies.

We segmented real hand behaviour, extracted the core parameters, and transcribed them into VR to create a consistent tool for hand tracking!

This will be available for any hand tracking platform supported by Interhaptics: Oculus, Vive, Leap Motion, and others planned for the future.

Here are some examples:

Precision Grab


Power Grab with Thumb


Power Grab without Thumb


Extreme cases


As you can see, hand tracking is a core part of the XR experience. For us, its development is the result of long reflection on how to improve each of its features. By observing the way each of us grabs or pulls something, we built the most realistic technology possible to support and optimize your own XR use. But this process is never over: it is constantly enriched by your feedback.

Click here to see our full video dedicated to hand tracking, and subscribe to our channel for more content. These features will shortly be available in the upcoming version of Interhaptics.

Check out all our articles here to read about how haptics keeps you immersed in your VR experiences. Extend your reality now by downloading Interhaptics, and design, develop, and deploy your own interactions.
