Advanced Hand Tracking and Lightning-Fast Interaction Setup

  • 2 March 2024
  • 3 min read

Welcome, XR Developers! Are you ready to dive into the latest enhancements Meta’s XR SDK version 62 has to offer? This guide will walk you through the exciting new features for hand tracking and the Interaction SDK, including Multimodal, Wide Motion Mode (WMM), Cap-sense, and Meta’s comprehensive interaction sample. Let’s enhance your XR projects with these cutting-edge capabilities!

🛠 Setting Up Your Environment

Before we explore the new features, it’s crucial to set up your development environment properly. Here’s how to get started:

  1. Create a New Unity Project: Use the all-in-one package for the Meta XR SDK. If you use the core package instead, install the Interaction SDK separately.
  2. Install Interaction SDK Samples: Download the “Example Scenes” from the “Samples” tab in the Package Manager.
  3. Configure Your Project: Utilize Meta’s Project Setup Tool and switch the platform to Android for device testing.
  4. Open the “Concurrent Hands Controllers Examples” Scene: This scene has Multimodal pre-configured, ready for you to explore.
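If you prefer editing your project manifest directly instead of going through the Package Manager UI, the setup above roughly corresponds to a manifest entry like the following. This is a sketch: the package identifier `com.meta.xr.sdk.all` and the version string are assumptions to verify against the Package Manager in your own project.

```json
{
  "dependencies": {
    "com.meta.xr.sdk.all": "62.0.0"
  }
}
```

After Unity resolves the package, the “Example Scenes” still need to be imported from the “Samples” tab, as they are not pulled in automatically.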

🤝 Multimodal Integration

Multimodal allows simultaneous use of hands and controllers. It does have limitations, however, such as incompatibility with Wide Motion Mode and tracked keyboards. Setting up Multimodal is straightforward:

  • Enable “Simultaneous Hands and Controllers” in the OVR Manager component.
  • Set “Controllers and Hands” for Hand Tracking Support.
  • Ensure you have detached left and right hand anchors in your tracking space.
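Once the steps above are in place, you can observe both input modalities at runtime. Here is a minimal sketch using the SDK’s `OVRHand` and `OVRInput` APIs; the class name is illustrative, and the two hand references are placeholders you would assign in the Inspector.

```csharp
using UnityEngine;

// Logs when hands and controllers are tracked at the same time,
// which is the scenario Multimodal enables.
public class MultimodalStateLogger : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;   // assign the left OVRHand
    [SerializeField] private OVRHand rightHand;  // assign the right OVRHand

    private void Update()
    {
        bool handsTracked = leftHand.IsTracked || rightHand.IsTracked;
        bool controllersConnected =
            OVRInput.IsControllerConnected(OVRInput.Controller.LTouch) ||
            OVRInput.IsControllerConnected(OVRInput.Controller.RTouch);

        if (handsTracked && controllersConnected)
        {
            Debug.Log("Multimodal active: hands and controllers tracked simultaneously.");
        }
    }
}
```

Without “Simultaneous Hands and Controllers” enabled, only one of the two conditions will be true at any given time, so the log line is a quick sanity check that your configuration took effect.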

🔄 Wide Motion Mode (WMM)

WMM extends hand tracking capabilities beyond the headset’s field of view using Inside-Out Body Tracking (IOBT). To enable WMM:

  • Import the “Features Examples” samples.
  • Open the “Debug Body Joints” scene.
  • Enable “Wide Motion Mode Hand Poses Enabled” and set Body Tracking support to “required”.
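A simple way to see WMM at work is to log the hand joints and watch them keep updating as your hand leaves the headset’s field of view. The sketch below uses the SDK’s `OVRSkeleton`/`OVRBone` types; the class name is illustrative, and the skeleton reference is assigned in the Inspector.

```csharp
using UnityEngine;

// Debug helper: prints the world position of every tracked hand joint.
// With WMM enabled, these positions continue updating via IOBT even
// when the hand is outside the cameras' direct view.
public class HandJointLogger : MonoBehaviour
{
    [SerializeField] private OVRSkeleton handSkeleton; // assign a hand's OVRSkeleton

    private void Update()
    {
        if (handSkeleton == null || !handSkeleton.IsDataValid) return;

        foreach (OVRBone bone in handSkeleton.Bones)
        {
            Debug.Log($"{bone.Id}: {bone.Transform.position}");
        }
    }
}
```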

✋ Cap-sense Hands

Recently out of beta, Cap-sense enhances hand representation when using controllers. To use Cap-sense:

  • Select the tracking state for hands and controllers.
  • Adjust settings in the OVR Manager, choosing among “None,” “Conforming To Controller,” and “Natural” for hand poses.
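Cap-sense poses are driven by the capacitive touch sensors on the Touch controllers, which you can read directly through `OVRInput`. The following sketch (class name illustrative) logs whether the thumb and index finger are resting on their sensors, even when no button is pressed:

```csharp
using UnityEngine;

// Reads the capacitive touch sensors on a Touch controller — the raw
// signal Cap-sense hands use to pose individual fingers.
public class CapTouchLogger : MonoBehaviour
{
    private void Update()
    {
        // True while the finger rests on the sensor, without pressing.
        bool thumbOnStick   = OVRInput.Get(OVRInput.Touch.PrimaryThumbstick);
        bool indexOnTrigger = OVRInput.Get(OVRInput.Touch.PrimaryIndexTrigger);

        Debug.Log($"Thumb on stick: {thumbOnStick}, index on trigger: {indexOnTrigger}");
    }
}
```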

🛠 Interaction SDK Enhancements

Meta’s Interaction SDK now includes quick action menus for setting up interactions in seconds:

  • Add a cube to your scene, right-click it, and select “Add Grab Interaction” from the “Interaction SDK” context menu.
  • Meta automatically adds all necessary components for grab interactions.
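For context, the quick action is attaching components from the `Oculus.Interaction` namespace to your object. The sketch below only illustrates the idea of wiring up a grabbable at runtime; the exact set of components the quick action generates varies by SDK version, so inspect the generated hierarchy in your own project rather than relying on this snippet.

```csharp
using UnityEngine;
using Oculus.Interaction;

// Illustrative only: ensures a Grabbable (the Interaction SDK component
// that receives pointer events and moves the transform) is present.
// The real quick action configures additional interactable components.
public class GrabCubeSetup : MonoBehaviour
{
    private void Awake()
    {
        if (GetComponent<Grabbable>() == null)
        {
            gameObject.AddComponent<Grabbable>();
        }
    }
}
```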

🎮 Comprehensive Rig Example Scene

Explore Meta’s “Comprehensive Rig Example” scene, packed with features like a menu system, object interactions, a YouTube-style video player, and various locomotion methods. This scene is a treasure trove for developers seeking inspiration and ready-to-implement solutions.

Conclusion

Meta’s XR SDK version 62 brings exciting new features to elevate your XR development projects. From seamless integration of hand tracking and controller interactions to quick setup tools and comprehensive examples, these enhancements promise to inspire and streamline your development process.

Support Black Whale🐋

Thank you for following this article. Stay tuned for more in-depth and insightful content in the realm of XR development! 🌐

Did this article help you out? Consider supporting me on Patreon, where you can find all the source code, or simply subscribe to my YouTube channel!
