Hand Tracking for XR: Naturally Inspired UX and Input
After speaking about haptics design and interaction building, let’s talk about one of the newest and most powerful technologies for XR: hand tracking and its input mechanisms.
Hand tracking systems bring a new level of immersion and user experience to extended reality applications. In a few words: the user sees a button, reaches out, and clicks it with their fingertips. Everything is easy there! Great user experience, little friction, and excellent immersion.
This feature is so compelling that almost every major headset maker is bringing hand tracking to its devices.
Hand tracking works so well because the user does not have to learn anything to interact with virtual objects. The user sees an object and reacts exactly as they would IRL (in real life). A ball on the table? Reach out and grab it!
It can be argued that the development of great intelligence in us as a human species is tied to the ability to create and manipulate new tools and objects. When we look in detail at how we, as humans, grasp and manipulate objects, we arrive at the map described in The GRASP Taxonomy of Human Grasp Types 1, illustrated in Figure 2.
This taxonomy is fundamentally split into two major categories, power and precision manipulation, as a function of how we control the position of the object: with the finger pads for precision manipulation, and against the palm of the hand for power manipulation.
What this division shows is how radically differently we manipulate different objects: how the fingers are arranged, and how the pre-manipulation action is performed to accommodate the final position.
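To make the distinction concrete, here is a minimal sketch of how the power/precision split could be expressed in code, assuming a hypothetical hand tracking API that reports which regions of the hand touch an object. The region names and the `classify_grasp` function are illustrative assumptions, not the Interhaptics implementation.

```python
# Hypothetical sketch: classify a grasp as "power" or "precision"
# from which parts of the hand are in contact with the object.
# Region names are illustrative, not a real tracking API.

PALM_REGIONS = {"palm", "proximal_phalanx"}
PAD_REGIONS = {"fingertip", "thumb_tip"}

def classify_grasp(contacts):
    """Return 'power' when the palm helps hold the object,
    'precision' when only the finger pads do, 'none' otherwise."""
    touching = set(contacts)
    if touching & PALM_REGIONS:
        return "power"      # object secured against the palm
    if touching & PAD_REGIONS:
        return "precision"  # object held between finger pads
    return "none"           # no meaningful contact

# A ball pressed into the palm vs. a pen held by the finger pads:
print(classify_grasp(["palm", "fingertip", "thumb_tip"]))  # power
print(classify_grasp(["fingertip", "thumb_tip"]))          # precision
```

The point of the sketch is that the same "grab" gesture maps to two very different manipulation modes depending on where contact happens, which is exactly why a single generic grab input falls short.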
Over the last several years of working on customer projects with hand tracking technologies, we quickly realized that a single pinch-and-grab input did not meet the needs of a smooth user experience and generated a significant amount of frustration for the end user.
Pinching to move a large object is not at all what the user expects!
In some cases, we were forced to give up entirely on an immersive behaviour of the environment because the input actions were not consistent, and users were fundamentally unable to use the product. That meant inconsistent user expectations across the product: some interactions were realistic, others highly abstract. This choice was not driven by user preference, but by the inconsistent response of a technology that did not take human factors into consideration.
The mission of Interhaptics is to provide consistent, realistic, and immersive development tools for XR creators, increasing the value of the final product. For this reason, in partnership with several research experts, we developed grabbing algorithms for hand tracking technology inspired by human behaviour. The objective of these algorithms is to deliver the best possible UX when using hand tracking.
We segmented real hand behaviour, extracted the core parameters, and transcribed them into VR to create a consistent tool for hand tracking!
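As a rough illustration of what "extracting core parameters" can look like, the sketch below distinguishes a pinch from a power grab using two quantities that most hand tracking APIs expose: the thumb-to-index fingertip distance and the average finger curl. The thresholds, the `detect_input` function, and its argument layout are assumptions for illustration, not the actual Interhaptics algorithm.

```python
import math

# Illustrative parameters (assumed values, not Interhaptics code):
PINCH_DISTANCE_M = 0.02   # tips closer than 2 cm -> pinch
CURL_THRESHOLD = 0.7      # 0 = fingers straight, 1 = fully curled

def detect_input(thumb_tip, index_tip, finger_curls):
    """Return 'pinch', 'power_grab', or 'open' from tracked
    fingertip positions (3D points, metres) and per-finger curl."""
    if math.dist(thumb_tip, index_tip) < PINCH_DISTANCE_M:
        return "pinch"      # precision input: finger pads meet
    if sum(finger_curls) / len(finger_curls) > CURL_THRESHOLD:
        return "power_grab" # whole hand closes around the object
    return "open"

# Fingertips 1 cm apart -> pinch:
print(detect_input((0, 0, 0), (0.01, 0, 0), [0.2, 0.2, 0.2, 0.2]))
# Tips far apart but fingers strongly curled -> power grab:
print(detect_input((0, 0, 0), (0.08, 0, 0), [0.9, 0.8, 0.9, 0.85]))
```

Routing each detected mode to a different manipulation behaviour (precision placement vs. whole-object grab) is what keeps the interaction consistent with the user's real-world expectations.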
This will be available for any hand tracking platform supported by Interhaptics: Oculus, Vive, Leap Motion, and others planned for the future.
Here are some examples:
Power Grab with Thumb
Power Grab without Thumb
As you can see, hand tracking is a core part of the XR experience. For us, its development is the result of long reflection on how to improve each of its features. By observing the way each of us grabs or pulls something, we built the most realistic technology possible to support and optimize your own XR use. But this process is never over: it is constantly enriched by your feedback.
Check out all our articles here to read about how haptics keeps you immersed in your VR experiences. Extend your reality now by downloading Interhaptics, and design, develop, and deploy your own interactions.