Configuring hand models and controllers

Updated on October 11, 2023


In this tutorial, you will learn how to configure animated hand models on VR Builder’s default rig. While there are some details specific to VR Builder, much of the content applies to any rig made with the XR Interaction Toolkit.

This is an intermediate-level tutorial and assumes some familiarity with the Unity engine and the XR Interaction Toolkit.

With VR Builder 2.4.0, the static controller models on the default rig were upgraded to animated hands. Still, users may want to customize them or replace them with their own models. Changing the hand models on a rig is not as straightforward as replacing a mesh, so this tutorial explains how the hand section of the VR Builder rig is set up, and what should be changed or checked when replacing hand models.

The Action Based Controller Manager

If you navigate through the hierarchy of the VR Builder rig and select a hand, you will notice a component called Action Based Controller Manager. Expanding the hierarchy one step further will reveal that each hand on the rig actually contains three XR Controller game objects: Base, Teleport and UI.

Generally, only one controller is needed per hand. VR Builder uses three controllers to allow you to customize and set up different keybindings or behaviors for the different use cases of manipulating objects, teleporting around, or interacting with a pointer.

The Action Based Controller Manager enables or disables the different controllers when the required input action is executed. For example, the teleport controller will be activated (and the others deactivated) when the Teleport Mode Activate input action is executed. This is bound to the trigger by default.

Each of the three controller states (Select (default), Teleport and UI) has its own set of events on this component. These are available for the user to execute functionality when the state is entered, exited or every frame while active.
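In simplified form, the manager's switching behavior can be sketched like this. This is an illustrative sketch, not the actual VR Builder or XRIT source; the class structure and field names are assumptions.

```csharp
// Simplified sketch of the Action Based Controller Manager's switching logic.
// Not the actual VR Builder/XRIT source; names and structure are illustrative.
using UnityEngine;
using UnityEngine.InputSystem;

public class SimplifiedControllerManager : MonoBehaviour
{
    [SerializeField] GameObject baseController;     // direct interactor, default Select state
    [SerializeField] GameObject teleportController; // ray interactor for teleporting
    [SerializeField] GameObject uiController;       // ray interactor for UI

    [SerializeField] InputActionReference teleportModeActivate;

    void OnEnable()
    {
        // Switch to the teleport controller while the input action is held,
        // and back to the base controller when it is released.
        teleportModeActivate.action.performed += _ => EnterTeleport();
        teleportModeActivate.action.canceled += _ => EnterSelect();
    }

    void EnterTeleport()
    {
        // Only one controller is active at a time.
        baseController.SetActive(false);
        uiController.SetActive(false);
        teleportController.SetActive(true);
    }

    void EnterSelect()
    {
        teleportController.SetActive(false);
        uiController.SetActive(false);
        baseController.SetActive(true);
    }
}
```

The real component also fires the per-state enter, exit and update events mentioned above; this sketch only shows the core idea of enabling one controller game object at a time.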

Controllers and Interactors

Let’s move one step down the hierarchy. All three controllers have two things in common: an XR Controller component and an Interactor component of some kind. The XR Controller reads inputs and the position from the input bindings and translates them into XRIT actions like Select or Activate.

It also references the controller model. This can be a prefab, but in our case that would result in three controller models spawning on each hand. Instead, we reference an object on the rig that is the parent of the actual model and is shared by all three controllers: the ModelPt empty object, a child of the base controller.

The Interactor components are used by XRIT to interact with the various interactable objects in the scene, e.g. a grabbable object or a teleport anchor. The base controller uses a direct interactor which requires the controller to collide with the object, while the others use ray interactors to interact from afar.

Regardless of type, all interactors can interact with objects by Selecting or Activating them when the corresponding button is pressed on the controller.

All interactors have some form of Attach Transform: dragging a game object into this field makes its transform the attach point for the interaction, meaning, for example, that a grabbed object will snap to it instead of to the controller’s transform.

On the VR Builder rig, these are configured as needed so rays and interactions look natural relative to the default hand. If you change or customize the hand model, you might have to reposition these transforms as well.
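If you prefer to assign an attach point from a script rather than in the inspector, a minimal sketch could look like the following. It assumes XRIT's `XRBaseInteractor.attachTransform` property and uses a hypothetical `palmAnchor` transform placed on the new hand model.

```csharp
// Sketch: repositioning an interactor's attach point to match a custom hand.
// Assumes XRIT's XRBaseInteractor exposes an attachTransform property.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class AttachPointSetup : MonoBehaviour
{
    [SerializeField] XRBaseInteractor interactor;
    [SerializeField] Transform palmAnchor; // hypothetical anchor on the custom hand model

    void Start()
    {
        // Grabbed objects will now snap to the palm anchor
        // instead of the interactor's own transform.
        interactor.attachTransform = palmAnchor;
    }
}
```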

Hand models

Let’s move down the hierarchy and find the hand models themselves under the ModelPt object. These have two components on them: the Animator and the Hand Animator Controller script.

The first works with Unity’s animation system and manages a state machine handling all the different hand poses. You can find more information on it in the Unity manual. The Hand Animator Controller provides the parameters driving the animator by reading them from the XR Controller. The strings specified in the Animator Parameters section should match the name of the corresponding parameters in the Unity animator.

The parameters are:

  • Select: the Select value of the controller.
  • Activate: the Activate value of the controller.
  • UI State: whether the UI controller is enabled.
  • Teleport State: whether the Teleport controller is enabled.

You can reuse this component to drive animations on your custom hand or controller models. You can rename the parameters as needed and, if necessary, reference specific controller and controller manager game objects.

If none is specified, as on the default rig, the component automatically searches for an Action Based Controller Manager in its parents and uses its base controller component.
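A stripped-down version of such a component might read the controller values and forward them to the animator as shown below. This is a sketch modeled on the behavior described above, not the actual Hand Animator Controller source; it assumes XRIT's `ActionBasedController` with its `selectActionValue` and `activateActionValue` properties.

```csharp
// Sketch of a hand animation driver, modeled on the Hand Animator Controller.
// Assumes XRIT's ActionBasedController exposes selectActionValue / activateActionValue.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(Animator))]
public class SimpleHandAnimator : MonoBehaviour
{
    [SerializeField] ActionBasedController controller;

    // These strings must match the parameter names in the Animator state machine.
    [SerializeField] string selectParameter = "Select";
    [SerializeField] string activateParameter = "Activate";

    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();

        if (controller == null)
        {
            // Like the default rig: fall back to a controller found in the parents.
            controller = GetComponentInParent<ActionBasedController>();
        }
    }

    void Update()
    {
        // Forward the 0..1 input values so the animator can blend hand poses.
        animator.SetFloat(selectParameter,
            controller.selectActionValue.action.ReadValue<float>());
        animator.SetFloat(activateParameter,
            controller.activateActionValue.action.ReadValue<float>());
    }
}
```

The real component additionally exposes the UI State and Teleport State parameters, which it derives from which controller the manager currently has enabled.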

Next steps

We hope this quick overview helped you better understand the VR Builder rig and motivated you to customize and improve it. We encourage you to join our Discord community, which is the main hub to discuss and get help with VR Builder. If you like VR Builder, it would also be extremely helpful to us if you could leave a review on the Unity Asset Store. Getting more visibility and more users will help us maintain the free and open source core of VR Builder. Thanks!
