August 2, 2022
Working on VR apps sometimes means creating complex interactable objects such as tools.
This tutorial explains how to create a barcode scanner that works in VR using Unity’s XR Interaction Toolkit. The scanner is activated by holding the trigger on the VR controller. If it is positioned in front of a valid label, it scans the label’s barcode and provides acoustic feedback: a "beep" is played.
You are free to read the scanned value and use it in your application’s logic, for example to fetch information from a database.
In this tutorial, we’ll cover:
- setting up a rig with direct interactors,
- making the scanner grabbable and animating its trigger,
- scanning barcode labels with a raycast,
- providing audio feedback on a successful scan.
Time to get started!
We will start from an empty Unity project with the default 3D template. We used Unity 2021.3.6f1, but any 2020 to 2022 version should be fine as well. Note, however, that there might be small differences in the hardware setup, depending on the device you are using.
We need to install the XR Interaction Toolkit from the Package Manager. If you are not sure how to do this, check out this XR Interaction Toolkit setup tutorial. There you will learn how to install the XR Interaction Toolkit, configure your project for VR and set up your rig.
Let’s set up a new scene and add a rig. We set up the rig by following the tutorial linked above, which left us with a working rig with two rays in place of controllers. However, we want to grab things with our hands directly, so let’s replace those with direct interactors.
To do so, navigate to the Left Hand Controller and Right Hand Controller, remove the ray interactor components and add an XR Direct Interactor component to each.
The direct interactor also requires a trigger collider to know if it’s touching something. For this, add a Sphere Collider to each controller, enable Is Trigger and set the radius to something like 0.1. This means that you will be able to interact with objects within about 10 cm of your controller’s position in VR.
Now the grab functionality has been set up.
Since we have not added any mesh to the rig yet, we are not able to see our controllers in the virtual environment.
For the sake of simplicity, we created two default cubes, scaled them down and colored them blue. For reference, these are the transform values we used.
We set each cube as a child of one of the hand controller game objects, then set Model Parent on each controller to the controller itself.
This is what our “hands” look like now. They may not look like much, but they work! If you prefer something more sophisticated, you can use controller models or animated hands as well, but we won’t cover that in this tutorial.
With the controllers sorted out, let’s create the rest of the scene. We created a simple plane to represent the floor and a cube to use as a table. Again, it’s function over form - we just need a surface we can place our tools on.
Finally, let’s import the contents of this Unity package. It contains a simple model for the barcode scanner and a few scripts that will be detailed later.
First, let’s drag the BarcodeScanner prefab from the Prefabs folder into the scene. This contains only the mesh and colliders for the scanner; we will do the rest during this tutorial. Just put it on the table.
Next, we want to make the scanner grabbable in VR. For this, add the XR Grab Interactable component to the game object; it will automatically add a Rigidbody as well. We recommend enabling Is Kinematic on the Rigidbody, so that the scanner does not fall to the ground when you release it in VR.
Press play and see what happens. You will notice that you can grab the object, and that it is always grabbed in the same position and orientation.
That’s actually what we want: the scanner should always be grabbed by the handle, not from an arbitrary point with precision grabbing. Still, it would be better if it was positioned so that the handle and trigger roughly match the shape of the controller. At the moment it is grabbed somewhere around the center of the scanner instead of at the handle.
To fix that, we need to specify the point where the object needs to be grabbed. The prefab already has an empty child object in the correct position, called AttachTransform. Drag it in the Attach Transform field of the XR Grab Interactable component and press Play again. Now the object should be grabbed at the correct orientation, with the scanner’s trigger roughly in the same position as the trigger on the controller. In case this does not work for your setup, feel free to adjust the position of the AttachTransform object as needed.
Now we want something to happen when we press the trigger on the controller and activate the barcode scanner. Eventually, we’ll want it to scan codes, but for now let’s start by just animating the trigger of the scanner to mimic the one on the controller.
For this, add the AnimateTrigger script, found in the Scripts folder, to the barcode scanner.
We need to drag the Trigger and TriggerPressedPosition game objects into the relevant fields to configure it. These are, respectively, the trigger itself and an empty object marking the position the trigger should be in when pressed.
This very simple script contains two public methods, PressTrigger() and ReleaseTrigger(), which take care of animating the trigger. We’ll need to call them at the right time.
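The script itself ships with the tutorial package, but a minimal version could look roughly like this. The field names are assumptions; only the two public method names come from the tutorial:

```csharp
using UnityEngine;

// Hypothetical sketch of the AnimateTrigger script's structure.
public class AnimateTrigger : MonoBehaviour
{
    [SerializeField] private Transform trigger;                 // the trigger mesh on the scanner
    [SerializeField] private Transform triggerPressedPosition;  // empty object at the pressed position

    private Vector3 restingPosition;

    private void Awake()
    {
        // Remember where the trigger sits when released.
        restingPosition = trigger.localPosition;
    }

    // Called from the XR Grab Interactable's Activated event.
    public void PressTrigger()
    {
        trigger.localPosition = triggerPressedPosition.localPosition;
    }

    // Called from the Deactivated event.
    public void ReleaseTrigger()
    {
        trigger.localPosition = restingPosition;
    }
}
```

A simple position swap like this is enough for a trigger; the packaged script may animate the movement more smoothly.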
To do so, scroll to the Events section of the XR Grab Interactable component and find the Activated and Deactivated events. These are called when the trigger is pressed or released. Add a call to each, drag the scanner into the object field and select the corresponding method from the Animate Trigger component.
Press play again, and try it out!
Now we want the scanner to actually read barcodes. For that we need a barcode, so let’s start by dragging the Box prefab in the scene. This is a cube with a child object that represents the label with the barcode (a white square).
As usual, the prefab contains only meshes and colliders. The Label child object represents the label to be scanned. It has no logic yet, but note that it has a collider: labels must have a collider or trigger on them, since they are detected by raycasting.
Let’s add the remaining logic to the label. Add the BarcodeLabel component to the Label game object.
The scanner will look for this component when it scans an object, and every label can be assigned a string value which identifies it. Type something in the value field, for instance "my test label".
This is the data that will be read by the scanner. What it actually represents depends on your use case. It could be a unique identifier which can be used to fetch information out of a database, for example.
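The packaged BarcodeLabel script is not shown in the tutorial, but conceptually it only needs to hold that string. A minimal sketch could look like this (the class name and the Value property follow the tutorial text; the rest is an assumption):

```csharp
using UnityEngine;

// Sketch of the BarcodeLabel component: a tag the scanner looks for,
// carrying the string value that identifies this label.
public class BarcodeLabel : MonoBehaviour
{
    // The string read by the scanner, e.g. "my test label".
    [SerializeField] private string value;

    public string Value => value;
}
```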
The next thing to do is to give the scanner the ability to read that label.
This can be done thanks to the code in the BarcodeScanner script. This script can raycast for a label, and if it hits one, it reads its string value. The value is stored in the LastCodeScanned property, and the CodeScanned event is called on a successful scan.
You will be able to register your own functions on the CodeScanned event in order to make use of the tool.
Add the script to the BarcodeScanner game object and let’s configure it. It requires a Transform that determines the origin and direction of the ray, and a float representing the tool’s scanning distance. Drag the already set up RayEmitter child object into the ray transform field, so that the ray comes from the front of the tool. The distance can be left at the default of 0.1 units; if you want to scan from farther away, you can of course increase the value. Note, however, that only the first hit is considered, so you cannot scan through other colliders.
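The internals of the packaged BarcodeScanner are not shown in this tutorial, but the scanning logic could be sketched roughly as follows. The property, event and method names follow the text; everything else, including the use of a UnityEvent, is an assumption:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of the BarcodeScanner logic: while scanning, raycast from the
// emitter and report any BarcodeLabel hit within range.
public class BarcodeScanner : MonoBehaviour
{
    [SerializeField] private Transform rayEmitter;      // origin and direction of the ray
    [SerializeField] private float scanDistance = 0.1f; // scanning distance in units

    // Invoked with the label's value on a successful scan.
    public UnityEvent<string> CodeScanned;

    public string LastCodeScanned { get; private set; }

    private bool isScanning;

    public void StartScanning() => isScanning = true;
    public void StopScanning() => isScanning = false;

    private void Update()
    {
        if (!isScanning)
        {
            return;
        }

        // Only the first collider hit counts, so labels cannot be
        // scanned through other colliders.
        if (Physics.Raycast(rayEmitter.position, rayEmitter.forward, out RaycastHit hit, scanDistance)
            && hit.collider.TryGetComponent(out BarcodeLabel label))
        {
            LastCodeScanned = label.Value;
            CodeScanned?.Invoke(label.Value);
            // The packaged script may additionally debounce repeated
            // hits on the same label while the trigger is held.
        }
    }
}
```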
Then, add the StartScanning and StopScanning functions from the script you just added to the Activated and Deactivated events on the interactable component, just like you did for the trigger animation script. This ensures that the tool starts and stops scanning for labels when the trigger is pressed or released.
When pressing play, the tool will work as intended, but we will not have any confirmation that the string has been scanned. First, let’s confirm the scanner script works correctly by subscribing to the CodeScanned event and printing a debug log with the scanned code.
Do so by adding the BarcodeScannerDebug script, which does exactly that, to the scanner game object.
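Such a debug listener is only a few lines. A sketch, assuming CodeScanned is a UnityEvent<string> that passes the scanned value (the packaged script may differ in detail):

```csharp
using UnityEngine;

// Sketch of a BarcodeScannerDebug-style listener: subscribe to
// CodeScanned and log every value that is read.
[RequireComponent(typeof(BarcodeScanner))]
public class BarcodeScannerDebug : MonoBehaviour
{
    private BarcodeScanner scanner;

    private void OnEnable()
    {
        scanner = GetComponent<BarcodeScanner>();
        scanner.CodeScanned.AddListener(OnCodeScanned);
    }

    private void OnDisable()
    {
        scanner.CodeScanned.RemoveListener(OnCodeScanned);
    }

    private void OnCodeScanned(string code)
    {
        Debug.Log($"Scanned barcode with value: {code}");
    }
}
```

Unsubscribing in OnDisable keeps the listener from piling up duplicate callbacks if the component is toggled.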
Now, if we scan the label we will see a message like the following in the log. Make sure log messages are enabled in your console.
It is nice to know that things work properly, but this will not help the user in VR. We can easily play a sound when a label is scanned by subscribing to the CodeScanned event. Let’s do this by adding an Audio Source component to the scanner game object and then configuring it.
As the audio clip, we can choose the Beep file included in the tutorial package. Check that Play on Awake and Loop are disabled to ensure our intended behavior. Move the Spatial Blend slider all the way to the right (3D). This makes the barcode scanner the source of the sound, so the user’s distance to it determines the perceived volume.
Now that the audio source is configured, let’s bind it to the event on the Barcode Scanner component. Drag the scanner object into the event’s field, reference the Audio Source, and select the Play() method, which plays the referenced clip just once.
Now, when pressing Play, a loud and clear beep will inform us we successfully scanned the label!
Congratulations! You now have a functional barcode scanner and know how to create labels for it. It’s up to you to put them to good use in a functional application. You can read the LastCodeScanned property or subscribe to the CodeScanned event to hook up your own logic.
If you are interested in creating a linear process using it, such as a training course or a tutorial, you could also check out VR Builder. VR Builder automates some steps in the creation of interactable tools like this one and makes it easy to integrate them into a working application. For example, when a code is scanned, you can make something happen in the application with a visual node editor.
Since VR Builder uses the XR Interaction Toolkit as well, you can reuse most of the barcode scanner we just created. The process is not identical, but it is somewhat easier.