Virtual Reality - Capi-Metaverse/Template GitHub Wiki

This page explains how to install all the packages needed for a Unity OpenXR virtual reality project to work correctly. It also covers how to create a character, how to move it through a 3D environment, and how to make it interact with and grab objects in that environment.

Virtual Reality package installation

Step 0

From Unity, in the menu bar at the top, go to 'Window' and open the 'Package Manager'. Then click the '+' button at the top left and select 'Add package from git URL' or 'Add package by name'.

packageManager

Step 1

Install XR Plugin Management (com.unity.xr.management), version 4.4.1 or higher. This package manages the rest of the virtual reality packages in a simpler way.

Step 2

Install OpenXR Plugin (com.unity.xr.openxr); version 1.5.3 has been used here. Although each version improves multiple details of the previous ones, it is advisable to stay on a stable version for the correct operation of the package.

Step 3

Install XR Interaction Toolkit (com.unity.xr.interaction.toolkit). Version 2.0.4 is usually installed by default, but in this case it has been updated to version 3.0.1, since it allows installing the 'Starter Assets' sample, which is very useful for checking that the virtual reality packages work correctly. In addition, the VR 'character' included in the demo scene can be used as a base for the one you want to create. See the Unity XR Interaction Toolkit manual.

VR Character

Once the OpenXR packages and the Interaction Toolkit are installed, we can proceed to assemble our character. The 'character' consists of an 'XR Rig' that adjusts the character to the player's height and lets you walk through the 3D environment using your real movements. Inside the 'XR Rig' the different required 'GameObjects' are nested. These are:

- A camera that carries a component called the 'Tracked Pose Driver', which makes Unity's camera follow your actual head movements while you wear the VR headset.
- The left and right controllers, responsible for converting controller input into actions in Unity. These actions include the 'Direct Interactor', which interacts with objects when the controller's 'collider' enters an interactable object; the 'Ray Interactor', which does the same as the previous one but at a distance, thanks to a ray cast from the controller; and the 'Teleport Interactor', which teleports you to the location you are pointing at.

WARNING: To teleport the VR character you must teleport the 'XR Origin' GameObject; if you teleport the outermost parent, it may not work.

vrCharacter
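The teleport warning above can be sketched in code; the class and field names here are illustrative, not part of the toolkit:

```csharp
using UnityEngine;

// Hypothetical helper illustrating the warning: move the 'XR Origin'
// GameObject itself, not the outermost parent of the rig.
public class TeleportXROrigin : MonoBehaviour
{
    public Transform xrOrigin; // drag the 'XR Origin' GameObject here

    public void TeleportTo(Vector3 destination)
    {
        // Moving the XR Origin transform relocates the whole rig,
        // including the tracked camera and both controllers.
        xrOrigin.position = destination;
    }
}
```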

To enhance immersion, the visual aspect of the controllers as seen in virtual reality has been modified. The Oculus Hands Unity package provides hand models and many animations for them. We can replace the aforementioned controllers with human hands that animate depending on the pressed buttons, using the custom script 'AnimateHandOnInput'.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

public class AnimateHandOnInput : MonoBehaviour
{
    public InputActionProperty pinchAnimationAction; // bind to the controller's 'Activate Value' (trigger) action
    public InputActionProperty gripAnimationAction;  // bind to the controller's 'Select Value' (grip) action
    public Animator handAnimator;

    void Update()
    {
        float triggerValue = pinchAnimationAction.action.ReadValue<float>();
        handAnimator.SetFloat("Trigger", triggerValue);

        float gripValue = gripAnimationAction.action.ReadValue<float>();
        handAnimator.SetFloat("Grip", gripValue);

    }

}

This script must be placed on the 'prefab' of each hand (left and right), passing as parameters the 'trigger' and grip button actions of the corresponding controller, as well as the 'Animator' of each hand.
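The same wiring could also be done from code instead of the Inspector. The following is only a sketch: the action paths assume the XRI default Input Action Asset, and 'HandSetupExample' is a hypothetical name:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical wiring example: attach AnimateHandOnInput at runtime
// and bind it to actions from an Input Action Asset.
public class HandSetupExample : MonoBehaviour
{
    public InputActionAsset actions;  // e.g. the XRI default input actions
    public Animator leftHandAnimator; // the Animator on the hand model

    void Start()
    {
        var hand = gameObject.AddComponent<AnimateHandOnInput>();
        // The action paths below are assumptions; use the names
        // defined in your own Input Action Asset.
        hand.pinchAnimationAction = new InputActionProperty(
            actions.FindAction("XRI LeftHand Interaction/Activate Value"));
        hand.gripAnimationAction = new InputActionProperty(
            actions.FindAction("XRI LeftHand Interaction/Select Value"));
        hand.handAnimator = leftHandAnimator;
    }
}
```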

Teleportation area

To create an area in which the character can teleport using the controllers, we must add the 'Teleportation Area' component to the corresponding floor, pass it the floor's 'collider', and change the 'Interaction Layer Mask' to 'Teleport'. This layer determines which objects the 'raycast' from the hands interacts with; if it is left at the default, teleportation will be triggered by the same input as grabbing objects.

tpArea
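A rough code equivalent of the Inspector setup above might look like this. It is a sketch assuming the XRI 2.x API (some classes moved to new namespaces in 3.x, so adjust the usings for your installed version):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: configure a floor as a teleportation target at runtime.
public class FloorSetupExample : MonoBehaviour
{
    void Start()
    {
        var area = gameObject.AddComponent<TeleportationArea>();
        area.colliders.Add(GetComponent<Collider>()); // the floor's collider
        // 'Teleport' is assumed to exist in the project's Interaction
        // Layer Settings; if the mask is left at default, teleporting
        // shares the same input as grabbing.
        area.interactionLayers = InteractionLayerMask.GetMask("Teleport");
    }
}
```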

Interactable objects

To make an object grabbable it is necessary to add the 'XR Grab Interactable' component, which needs the object's 'collider' passed as a parameter. A 'Rigidbody' (without gravity, or kinematic) is also needed; Unity adds it automatically when 'XR Grab Interactable' is added. To make the object fall under gravity when it is dropped, check the 'Force Gravity on Detach' option.

Within the component options we can choose whether the object is grabbed with one or two hands using the 'Select Mode' option (Single or Multiple); set the 'Movement Type', which is recommended to be 'Velocity Tracking' for greater fidelity to real life; and decide whether the object is grabbed freely or makes the hand adapt to a specific position using the 'Dynamic Attach' option. Using the far/near interaction settings it is possible to decide whether the object can be grabbed from a distance or only when the virtual hand collides with it.
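The options above can also be set from code. This is a sketch using the XRI 2.x property names (some namespaces moved in 3.x):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: make an object grabbable with the options discussed above.
public class GrabbableSetupExample : MonoBehaviour
{
    void Start()
    {
        // Adding XRGrabInteractable also adds a Rigidbody if missing.
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.selectMode = InteractableSelectMode.Single; // one hand at a time
        grab.useDynamicAttach = true;      // hand adapts to the grab point
        grab.forceGravityOnDetach = true;  // falls when dropped
    }
}
```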

Interactable Events can also be used to add new functionality to interactable objects, such as running custom logic when an object is grabbed or dropped.

interactableObject
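For example, listeners for the 'Select Entered' and 'Select Exited' events might look like this (a sketch assuming the XRI 2.x API; the log messages are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: react when this object is grabbed or dropped.
public class GrabLogger : MonoBehaviour
{
    void Start()
    {
        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(_ => Debug.Log("Grabbed"));
        grab.selectExited.AddListener(_ => Debug.Log("Dropped"));
    }
}
```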