Unity XR input

This section provides information on all Unity supported input devices used to interact in Virtual Reality, Augmented Reality and Mixed Reality applications.

XR input mappings

XR platforms typically provide a rich diversity of input features for you to take advantage of when designing user interactions. Positions, rotations, touch, buttons, joysticks, and finger sensors all provide specific pieces of data. At the same time, different XR platforms provide access to these input features in ways that can vary subtly, such as between the Vive and the Oculus, or drastically, such as between a desktop VR platform and a mobile platform like Daydream.

Unity defines a standard set of feature usages that you can use across XR platforms to access user input in a platform-agnostic way. For example, Unity defines a “Trigger” feature usage as a single-axis input on all XR platforms. With the Feature Usage API, you can get the trigger state by name rather than setting up an axis (or a button on some XR platforms) for the conventional Unity Input system.
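
For example, the following minimal sketch reads the trigger as a single axis through the Feature Usage API. It assumes a right-hand controller is currently connected; retrieving devices is covered in more detail in the sections below.

var device = UnityEngine.XR.InputDevices.GetDeviceAtXRNode(UnityEngine.XR.XRNode.RightHand);
float triggerValue;
if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.trigger, out triggerValue))
{
    Debug.Log(string.Format("Trigger axis value is '{0}'", triggerValue));
}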

The following table lists the standard controller input feature usage names and how they map to the controllers of popular XR systems:

Feature Usage FeatureType Legacy Input Index [L/R] WMR Oculus GearVR Daydream OpenVR (Full) Vive OpenVR (Oculus) OpenVR (WMR)
Primary2DAxis 2D Axis [(1,2)/(4,5)] Joystick Joystick Joystick Touchpad [Trackpad/Joystick] Trackpad Joystick Joystick
Trigger Axis [9/10] Trigger Trigger Trigger Trigger Trigger Trigger Trigger Trigger
Grip Axis [11/12] Grip Grip Grip Grip Grip Grip Grip
IndexTouch Axis [13/14] Index - Near Touch
ThumbTouch Axis [15/16] Thumb - Near Touch
Secondary2DAxis 2DAxis [(17,18)/(19,20)] Touchpad Touchpad
IndexFinger Axis [21/22] Index
MiddleFinger Axis [23/24] Middle
RingFinger Axis [25/26] Ring
PinkyFinger Axis [27/28] Pinky
CombinedTrigger Axis [3/3] CombinedTrigger Combined Trigger Combined Trigger Combined Trigger Combined Trigger Combined Trigger
PrimaryButton Button [2/0] [X/A] App Primary Primary Primary [Y/B] Menu
PrimaryTouch Button [12/10] [X/A] - Touch
SecondaryButton Button [3/1] [Y/B] Alternate Alternate [B/A]
SecondaryTouch Button [13/11] [Y/B] - Touch
GripButton Button [4/5] Grip - Press Grip - Press Grip - Press Grip - Press Grip - Press Grip - Press Grip
TriggerButton Button [14/15] Trigger - Press Index - Touch Trigger - Press Trigger - Press Trigger - Press Trigger - Press Trigger - Touch Trigger-Press
MenuButton Button [6/7] Menu Start (6)
Primary2DAxisClick Button [8/9] Touchpad - Click Thumbstick - Click Touchpad - Click Touchpad - Click StickOrPad - Press StickOrPad - Press StickOrPad - Press Touchpad - Click
Primary2DAxisTouch Button [16/17] Touchpad - Touch Thumbstick - Touch Touchpad - Touch Touchpad - Touch StickOrPad - Touch StickOrPad - Touch StickOrPad - Touch Touchpad - Touch
Thumbrest Button [18/19] Joystick - Click ThumbRest - Touch

See XR.CommonUsages for a definition of each feature usage.

Accessing input devices

Use the XR.InputDevices class to get the input devices (such as controllers and trackers) that are currently connected to the XR system.

Use InputDevices.GetDevices() to get a list of all connected devices:

var inputDevices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevices(inputDevices);
foreach (var device in inputDevices)
{
    Debug.Log(string.Format("Device found with name '{0}' and role '{1}'", 
              device.name, device.role.ToString()));
}

An InputDevice object remains valid across frames until the XR system disconnects it. You can use the InputDevice.isValid property to determine whether an InputDevice object still represents an active controller. (Note that there can be considerable lag between the XR system losing its connection to a device and the corresponding InputDevice object becoming invalid.)
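
A minimal sketch of this pattern, assuming a hypothetical cachedDevice field that holds a previously found InputDevice:

if (!cachedDevice.isValid)
{
    // The controller has disconnected; run device discovery again to find a replacement.
    var devices = new List<UnityEngine.XR.InputDevice>();
    UnityEngine.XR.InputDevices.GetDevices(devices);
    if (devices.Count > 0)
        cachedDevice = devices[0]; // illustration only: pick the first discovered device
}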

Accessing input devices by role

A device role describes the general function of an input device. Use the InputDeviceRole enumeration to specify a device role. The defined roles include:

  • GameController — a console-style game controller.
  • Generic — a device that doesn’t fit another role definition. For example, an HMD is typically reported as a generic device.
  • HardwareTracker — a tracking device.
  • LeftHanded — a device associated with the user’s left hand.
  • RightHanded — a device associated with the user’s right hand.
  • TrackingReference — a device that tracks other devices. For example, the Oculus tracking cameras are reported as tracking references.

These roles are reported by the underlying XR SDK and different providers can organize their device roles differently. In addition, a user can switch hands, so the role assignment may not match the physical hand holding the input device. For example, a user must set up the Daydream controller as right or left handed, but can hold the controller in the opposite hand just as easily.

GetDevicesWithRole() provides a list of any devices with a specific InputDeviceRole. For example, you can use the InputDeviceRole.GameController role to get any connected GameController devices:

var gameControllers = new List<UnityEngine.XR.InputDevice>();

UnityEngine.XR.InputDevices.GetDevicesWithRole(UnityEngine.XR.InputDeviceRole.GameController, gameControllers);
foreach (var device in gameControllers)
{
    Debug.Log(string.Format("Device name '{0}' has role '{1}'", 
              device.name, device.role.ToString()));
}

Accessing input devices by XR node

XR nodes represent the physical points of reference in the XR system. For example, the user’s head position, their right and left hands, and a tracking reference such as an Oculus camera are all XR nodes. The defined nodes include:

  • CenterEye: a point midway between the pupils of the user’s eyes.
  • GameController: a console-style game controller. Multiple game controller nodes can exist.
  • HardwareTracker: a hardware tracking device, typically attached to the user or another object. Multiple hardware tracker nodes can exist.
  • Head: the center point of the user’s head (as calculated by the XR system).
  • LeftEye: the user’s left eye.
  • LeftHand: the user’s left hand.
  • RightEye: the user’s right eye.
  • RightHand: the user’s right hand.
  • TrackingReference: a tracking reference point, such as the Oculus camera. Multiple tracking reference nodes can exist.

The XRNode enumeration defines the available nodes.

Use InputDevices.GetDevicesAtXRNode() to get a list of devices associated with a specific XRNode. For example, to get a left-handed controller:

var leftHandDevices = new List<UnityEngine.XR.InputDevice>();
UnityEngine.XR.InputDevices.GetDevicesAtXRNode(UnityEngine.XR.XRNode.LeftHand,
                                                 leftHandDevices);
if(leftHandDevices.Count == 1)
{
    UnityEngine.XR.InputDevice device = leftHandDevices[0];
    Debug.Log(string.Format("Device name '{0}' with role '{1}'", 
                            device.name, device.role.ToString()));
}
else if(leftHandDevices.Count > 1)
{
    Debug.Log("Found more than one left hand!");
}

Accessing Input Features on an Input Device

Read an input feature, such as the state of a trigger button, from a specific InputDevice object. For example, to read the state of the right trigger, first get an instance of the right-handed device (using InputDeviceRole.RightHanded or XRNode.RightHand). Once you have the correct device, use the InputDevice.TryGetFeatureValue() function to access the current state.

TryGetFeatureValue() attempts to access the current value of a feature, but can fail if the current device does not support the specified feature, or if the device object is no longer valid (for example, because the controller is no longer active). The function returns true if it successfully retrieves the specified feature value, and false if it fails.

The easiest way to get a particular button, touch input, or joystick axis value is to use the CommonUsages class. CommonUsages includes all the feature usages in the input mapping table, as well as tracking features like position and rotation. For example, the following code uses CommonUsages.triggerButton to detect whether the player is currently pulling the trigger button on a particular InputDevice instance:

bool triggerValue;
if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.triggerButton, 
                              out triggerValue) 
    && triggerValue)
{
    Debug.Log("Trigger button is pressed");
}

You can also use the TryGetFeatureUsages() function to get a list of all the feature usages that a device provides. This function returns a list of InputFeatureUsage objects, which have name and type properties describing the feature. The following example enumerates all the Boolean features provided by a given input device:

var inputFeatures = new List<UnityEngine.XR.InputFeatureUsage>();
if (device.TryGetFeatureUsages(inputFeatures))
{
    foreach (var feature in inputFeatures)
    {
        if (feature.type == typeof(bool))
        {
            bool featureValue;
            if (device.TryGetFeatureValue(feature.As<bool>(), out featureValue))
            {
                Debug.Log(string.Format("Bool feature '{0}''s value is '{1}'",
                          feature.name, featureValue.ToString()));
            }
        }
    }
}

Primary Button example

Different controller configurations can provide a particular feature: for example, multiple controllers on one system, different controllers on different systems, or different buttons on the same controllers with different SDKs (for example, using Oculus versus OpenVR with an Oculus Rift). This diversity can make it more complicated to support input from a wide range of XR systems. The Unity Feature Usages API can help you get input in an XR platform-agnostic way.

The following example accesses the Primary Button feature usage no matter which controller or input device provides it. The example includes a class that scans the available devices for the Primary Button feature. The class monitors the value of the feature on any connected device and if the value changes, the class dispatches a UnityEvent.

To use this class, add it as a component to any GameObject in the Scene.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.XR;

[System.Serializable]
public class PrimaryButtonEvent : UnityEvent<bool>{}

public class PrimaryButtonWatcher : MonoBehaviour
{
    public PrimaryButtonEvent primaryButtonPress;
    private bool lastButtonState = false;
    private List<UnityEngine.XR.InputDevice> allDevices;
    private List<UnityEngine.XR.InputDevice> devicesWithPrimaryButton;

    void Start()
    {
        if(primaryButtonPress == null)
        {
            primaryButtonPress = new PrimaryButtonEvent();
        }

        allDevices = new List<UnityEngine.XR.InputDevice>();
        devicesWithPrimaryButton = new List<UnityEngine.XR.InputDevice>();
        InputTracking.nodeAdded += InputTracking_nodeAdded;
    }

    // check for new input devices when new XRNode is added
    private void InputTracking_nodeAdded(XRNodeState obj)
    {
        updateInputDevices();
    }

    void Update()
    {
        bool tempState = false;
        bool invalidDeviceFound = false;
        foreach(var device in devicesWithPrimaryButton)
        {
            bool buttonState = false;
            tempState = device.isValid // the device is still valid
                        && device.TryGetFeatureValue(CommonUsages.primaryButton, out buttonState) // did get a value
                        && buttonState // the value we got
                        || tempState; // cumulative result from other controllers

            if (!device.isValid)
                invalidDeviceFound = true;
        }

        if (tempState != lastButtonState) // Button state changed since last frame
        {
            primaryButtonPress.Invoke(tempState);
            lastButtonState = tempState;
        }

        if (invalidDeviceFound || devicesWithPrimaryButton.Count == 0) // refresh device lists
            updateInputDevices();
    }

    // find any devices supporting the desired feature usage
    void updateInputDevices()
    {
        devicesWithPrimaryButton.Clear();
        UnityEngine.XR.InputDevices.GetDevices(allDevices);
        bool discardedValue;

        foreach (var device in allDevices)
        {
            if(device.TryGetFeatureValue(CommonUsages.primaryButton, out discardedValue))
            {
                devicesWithPrimaryButton.Add(device); // Add any devices that have a primary button.
            }
        }
    }
}

The following PrimaryReactor class uses the PrimaryButtonWatcher defined above to detect when you press a primary button and, in response, rotates the GameObject it is attached to. To use this class, add it to a visible GameObject, such as a Cube, and drag the PrimaryButtonWatcher reference to the watcher property.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PrimaryReactor : MonoBehaviour
{
    public PrimaryButtonWatcher watcher;
    public bool IsPressed = false; // public to show button state in the Unity Inspector window
    public Vector3 rotationAngle = new Vector3(45, 45, 45);
    public float rotationDuration = 0.25f; // seconds

    private Quaternion offRotation;
    private Quaternion onRotation;
    private Coroutine rotator;

    void Start()
    {
        watcher.primaryButtonPress.AddListener(onPrimaryButtonEvent);
        offRotation = this.transform.rotation;
        onRotation = Quaternion.Euler(rotationAngle) * offRotation;
    }

    public void onPrimaryButtonEvent(bool pressed)
    {
        IsPressed = pressed;
        if (rotator != null)
            StopCoroutine(rotator);

        if (pressed)
            rotator = StartCoroutine(AnimateRotation(this.transform.rotation, onRotation));
        else
            rotator = StartCoroutine(AnimateRotation(this.transform.rotation, offRotation));
    }

    private IEnumerator AnimateRotation(Quaternion fromRotation, Quaternion toRotation)
    {
        float t = 0;
        while(t < rotationDuration)
        {
            transform.rotation = Quaternion.Lerp(fromRotation, toRotation, t/rotationDuration);
            t += Time.deltaTime;
            yield return null;
        }
        transform.rotation = toRotation; // snap to the target so the rotation always finishes exactly
    }
}

XR Input through Legacy Input System

You can poll XR input features via the legacy input system, using the appropriate legacy input indices from the table above. Create an axis mapping in Edit > Project Settings > Input to map the input name to the axis index of the platform device’s feature. You can then retrieve the button or axis value with Input.GetAxis or Input.GetButton, passing in the mapped axis or button name.
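
For example, the following minimal sketch assumes you have created an Input Manager axis named "RightTrigger" and pointed it at the 10th joystick axis (the right-hand Trigger index from the table above); the axis name is a placeholder and must match whatever you entered in the Input Manager.

float trigger = Input.GetAxis("RightTrigger"); // "RightTrigger" is a hypothetical axis name
if (trigger > 0.1f)
{
    Debug.Log(string.Format("Right trigger pulled to '{0}'", trigger));
}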

Haptics

You can retrieve input devices for any currently tracked XRNode. If there is a valid input device at a tracked XRNode, Unity can route haptic data to the corresponding input device to provide the wearer with immersive feedback. Unity can send haptic data either as a simple impulse with amplitude and duration, or as a buffer of data.

Not all platforms support all types of haptics, but you can query a device for haptic capabilities. The following example gets an input device for the right hand, checks whether the device supports haptics, and then plays back an impulse if it does:

InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
HapticCapabilities capabilities;
if(device.TryGetHapticCapabilities(out capabilities))
{
    if(capabilities.supportsImpulse)
    {
        uint channel = 0;
        float amplitude = 0.5f;
        float duration = 1.0f;
        device.SendHapticImpulse(channel, amplitude, duration);
    }
}

Note: this code example uses the UnityEngine.XR namespace.
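
Buffered haptics follow the same pattern. The following minimal sketch (also using the UnityEngine.XR namespace) checks for buffer support and sends an arbitrary placeholder waveform; a real application would generate samples that match the buffer format the device reports in its capabilities:

InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
HapticCapabilities capabilities;
if (device.TryGetHapticCapabilities(out capabilities) && capabilities.supportsBuffer)
{
    // Placeholder waveform: a short burst of constant-amplitude samples.
    byte[] buffer = new byte[64];
    for (int i = 0; i < buffer.Length; i++)
        buffer[i] = 128;

    uint channel = 0;
    device.SendHapticBuffer(channel, buffer);
}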

