Destructible Global Mesh, Instant Content Placement and MRUK Trackables – Meta’s MRUK v71 update

  • 22 January 2025
  • 3 min read

Hello XR Developers!

Meta’s MR Utility Kit (MRUK) adds powerful utilities on top of the Scene API to simplify building spatially-aware apps. It provides tools for ray-casting against scene objects, finding spawn locations, placing objects on surfaces, and achieving dynamic lighting effects. In this quick overview of MRUK v71, we’ll explore exciting new features. If you’re new to MRUK, check out the introduction video first!
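As a quick refresher, here is a minimal sketch of the kind of scene queries MRUK enables once the room data has loaded. The method names (RegisterSceneLoadedCallback, GetCurrentRoom, Raycast, GenerateRandomPositionInRoom) and their exact signatures are based on recent MRUK releases and may differ slightly between versions, so treat this as an illustration rather than copy-paste code.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch only: ray-cast against the scanned room and pick a spawn location.
// Method names and signatures are assumptions based on recent MRUK releases.
public class SceneQueryExample : MonoBehaviour
{
    void Start()
    {
        // Wait for MRUK to finish loading the scene anchors before querying them.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();

        // Ray-cast against the room's anchors, here from this object's forward axis.
        var ray = new Ray(transform.position, transform.forward);
        if (room.Raycast(ray, 10f, out RaycastHit hit, out MRUKAnchor anchor))
        {
            Debug.Log($"Hit {anchor.name} at {hit.point}");
        }

        // Find a random free position inside the room, away from surfaces.
        Vector3? spawnPosition = room.GenerateRandomPositionInRoom(0.3f, true);
        if (spawnPosition.HasValue)
        {
            Debug.Log($"Spawn location: {spawnPosition.Value}");
        }
    }
}
```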

New Sample Scenes in v71

Updating to MRUK v71 introduces new sample scenes under the Samples tab:

  • Destructible Mesh
  • Environment Panel Placement (Beta)
  • Keyboard Tracking

We’ll walk through these features step-by-step, starting with the Destructible Mesh system.

Destructible Mesh System

  1. Setup:
    • Import Camera Rig and Passthrough Building Blocks.
    • Import the MRUK prefab.
    • Create an empty GameObject called DestructibleMesh and add DestructibleGlobalMeshSpawner and DestructibleMeshExperience components.
  2. Core Components:
    • DestructibleGlobalMeshSpawner handles creating global destructible meshes. It registers callbacks for room creation and removal, customizes segmentation density, and supports reserved spaces (indestructible regions).
    • DestructibleMeshComponent segments and manages destructible mesh parts. It uses SegmentMesh to split meshes, defining reserved areas via reservedMin and reservedMax vectors.
    • DestructibleMeshExperience provides user-facing controls for destruction. Inputs include index triggers for segment destruction and hand grips for enabling debug visuals.
  3. Customization Options:
    • Segmentation Density: Adjust PointsPerUnitX, PointsPerUnitY, and MaxPointsCount to balance chunk detail against performance (see the sketch after this list).
    • Reserved Spaces: Prevent floor and ceiling destruction.
    • Global Mesh Material: Assign a material to all mesh segments.
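To make the setup concrete, here is a minimal configuration sketch based on the components and fields named above. The property names mirror this post's description of the inspector fields; treat the exact casing, types, and call sites as assumptions and verify them against the package source.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch only: wire up the destructible global mesh at runtime instead of via the
// inspector. Field names follow the description above and are assumptions.
public class DestructibleMeshSetup : MonoBehaviour
{
    void Awake()
    {
        // Spawner: creates a destructible version of the room's global mesh and
        // handles the room creation/removal callbacks internally.
        var spawner = gameObject.AddComponent<DestructibleGlobalMeshSpawner>();

        // Segmentation density: fewer points per unit means larger, cheaper chunks.
        spawner.PointsPerUnitX = 2;
        spawner.PointsPerUnitY = 2;
        spawner.MaxPointsCount = 256;

        // Reserved spaces (e.g. floor and ceiling) can be marked indestructible on
        // the spawner; under the hood, DestructibleMeshComponent.SegmentMesh defines
        // these regions via the reservedMin/reservedMax vectors described above.
        // A global mesh material can also be assigned so every segment shares it.

        // User-facing controls: index trigger destroys segments, hand grip toggles
        // debug visuals.
        gameObject.AddComponent<DestructibleMeshExperience>();
    }
}
```

In most projects you would add and tune these components in the inspector as described in the setup step; the script form just makes the relationship between the fields explicit.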

Tip: Use Destructible Mesh to create interactive effects, such as revealing portals, as seen in First Encounters.

Instant Content Placement (Beta)

Instant Content Placement removes the need for room scanning. Using depth-based raycasting, developers can:

  • Detect surfaces in real-time.
  • Place objects with minimal user input.

Components:

  • EnvironmentRaycastManager manages raycasting against depth data.
  • EnvironmentPanelPlacement demonstrates how to place objects on walls or vertical surfaces.

Key Methods:

  • Raycast: Performs depth-based detection, returning hit points and surface normals.
  • TryGetEnvironmentPose: Finds suitable placement poses (a usage sketch follows this list).
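Here is a minimal placement sketch, assuming EnvironmentRaycastManager.Raycast returns a hit containing a point and a surface normal as described above; check the exact struct and overload names against the v71 sample.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch only: ray-cast against the depth-based environment from a controller ray
// and snap a panel onto the first surface hit. The EnvironmentRaycastHit fields
// used here (point, normal) are assumptions to verify against the package.
public class PanelPlacer : MonoBehaviour
{
    [SerializeField] EnvironmentRaycastManager raycastManager; // from the sample setup
    [SerializeField] Transform rayOrigin;                      // e.g. right controller
    [SerializeField] Transform panel;                          // content to place

    void Update()
    {
        var ray = new Ray(rayOrigin.position, rayOrigin.forward);

        // Depth-based detection: no prior room scan is required.
        if (raycastManager.Raycast(ray, out EnvironmentRaycastHit hit))
        {
            // Align the panel with the detected surface using the hit point and normal.
            panel.SetPositionAndRotation(hit.point, Quaternion.LookRotation(hit.normal));
        }
    }
}
```

Because the raycast works directly against depth data, this runs without any prior scene scan, which is what makes the placement feel instant.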

This feature accelerates development and simplifies spatial interactions.

Improved Keyboard Tracking

MRUK now includes built-in support for keyboard tracking:

  1. Enable Experimental Features in the OVR Manager.
  2. Use KeyboardManager for detection and visualization.

Core Functionalities:

  • OnTrackableAdded and OnTrackableRemoved track physical keyboards.
  • Prefab instantiation aligns a virtual model with the real keyboard (sketched at the end of this section).
  • Keyboard Input Listener displays keystrokes in a UI field.

Before starting, pair your Bluetooth keyboard in Meta Quest Settings under Devices > Keyboard.

You can also visualize passthrough cutouts to check that the virtual model is accurately aligned with the real keyboard.
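Putting the pieces together, here is a sketch of the keyboard flow modelled on the sample's OnTrackableAdded/OnTrackableRemoved handlers. The MRUKTrackable type and the way the callbacks are wired up are assumptions; the shipped KeyboardManager sample shows the exact hookup.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch only: when a keyboard trackable appears, spawn a virtual model on its pose;
// remove it when tracking is lost. Assumes the trackable is represented as a scene
// component with a Transform, which is an assumption to verify against the sample.
public class KeyboardVisualizer : MonoBehaviour
{
    [SerializeField] GameObject keyboardModelPrefab; // virtual stand-in for the real keyboard
    GameObject spawnedModel;

    // Wire this up to the trackable-added callback exposed by MRUK / the sample.
    public void OnTrackableAdded(MRUKTrackable trackable)
    {
        // Align the virtual model with the tracked keyboard and keep it following.
        spawnedModel = Instantiate(keyboardModelPrefab,
                                   trackable.transform.position,
                                   trackable.transform.rotation,
                                   trackable.transform);
    }

    public void OnTrackableRemoved(MRUKTrackable trackable)
    {
        if (spawnedModel != null)
        {
            Destroy(spawnedModel);
        }
    }
}
```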

Conclusion

MRUK v71 introduces powerful features to enhance spatial awareness and interaction. From destructible environments to instant object placement and seamless keyboard tracking, there’s plenty to explore. Try these tools in your next XR project and share your experiences! If you enjoy this content, leave a like, subscribe, and support us on Patreon for exclusive access to source code and tutorials. Join our XR developer community on Discord—see you there!

Support Black Whale🐋

Thank you for following this article. Stay tuned for more in-depth and insightful content in the realm of XR development! 🌐

Did this article help you out? Consider supporting me on Patreon, where you can find all the source code, or simply subscribe to my YouTube channel!
