Tracking Layers

VNyan has many layers of animation that are blended and added to your model. On top of the initial base layer, there are separate layers for each type of tracking (Web Camera, ARKit, Leap Motion, SteamVR, and four VMC layers), the pose system, the animation system, the PoseLayer (used by the Expressive Body Tilt and other plugins), and the Additive Bone Rotation nodes.

Unless otherwise specified, you should assume that all tracking layers are blended after the base layer. Exceptions are the PoseLayer, which is applied between the tracking layers and the Additive Bone Rotation nodes, and the Pose and Animation systems, which can be configured to have Leap Motion tracking applied after them.

Different tracking methods can be mixed to achieve the desired tracking result. However, tracking is applied in a fixed order, and values from later tracking layers overwrite those from earlier layers wherever both are applied. The order, from first to last, is: SteamVR > VMC 1 > VMC 2 > VMC 3 > VMC 4 > ARKit > Web Camera > Leap Motion. If, for example, you want to use VMC tracking for body movements and iPhone ARKit tracking for facial expressions, the Apply values for body parts under ARKit Tracking should be disabled.
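
To make the overwrite behaviour concrete, here is a rough sketch (illustrative only, not VNyan's actual implementation) in which each layer, in the order above, writes its values for the body parts it is set to apply, so later layers win:

```python
# Illustrative sketch only -- not VNyan's actual implementation.
LAYER_ORDER = ["SteamVR", "VMC 1", "VMC 2", "VMC 3", "VMC 4",
               "ARKit", "Web Camera", "Leap Motion"]

def resolve_tracking(layers):
    """layers maps a layer name to {body_part: value} for the body
    parts that layer is set to apply."""
    result = {}
    for name in LAYER_ORDER:
        for body_part, value in layers.get(name, {}).items():
            result[body_part] = value  # later layers overwrite earlier ones
    return result

# VMC 1 drives hips and head; ARKit overwrites the head because it
# comes later in the order. Disabling ARKit's Apply value for the
# head would keep VMC 1's head tracking instead.
print(resolve_tracking({
    "VMC 1": {"hips": "vmc pose", "head": "vmc pose"},
    "ARKit": {"head": "arkit pose"},
}))
# {'hips': 'vmc pose', 'head': 'arkit pose'}
```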


Base Layer

In the Base Layer window, you can apply a default Arm Angle onto the base layer of your model to position your arms out of T-Pose when no arm-tracking is used. You can also apply a Root Rotation, which will adjust your model's default angle if it is not facing in the desired direction. The Base Layer window also contains the toggle to enable or disable arm tracking during vsf-animations.

Web Camera Tracking

The Web Camera Tracking window contains the configuration options for tracking through your system's webcam. Select your camera device to turn tracking on. You can change the camera settings in case a different mode works better for you. These settings are provided by the device itself, so only the modes available to your device will be listed. By running VNyanCapture.exe, you can find your device's recommended settings and check for issues when VNyan tries to access your webcam. VNyanCapture.exe can be found in your VNyan install folder, under \VNyan_Data\StreamingAssets\Mediapipe\.

VNyan supports ARKit blendshape and head tracking using your web camera. Note that the tongueOut and cheekPuff blendshapes are currently not supported this way.

Below is a detailed overview of the adjustable tracking settings.

| Setting Name | Description |
| --- | --- |
| Use Eye Bones | Use eye bones to map eye tracking. |
| Link Eye Blinks | Force both eyes on your model to blink if one of them blinks. |
| Linked Blink Threshold | Adjust how sensitive Linked Eye Blinks' activation is, for example to allow winking. |
| Mirror Tracking | Inverts left and right during tracking; makes webcam tracking function as a mirror. |
| Convert ARKit Tracking to Simple Blendshapes | * |
| Plant Feet | Forces your model's feet to remain planted on the floor. |
| Track Blendshapes | Toggle whether webcam tracking should be used for facial tracking. |
| Track Hands | Enables hand tracking through the webcam (experimental). |
| Movement Range | Caps the overall movement range of your model. |
| Rotation Range | Caps the overall rotation range of your model. |
| Head Tilt Range | Caps the roll of the head rotation. |
| Nod Range | Caps the pitch of the head rotation. |
| Altitude Range | Caps how much your model can move up and down. |
| Depth Range | Caps how much your model can move forwards and backwards. |
| Pelvis Stiffness | Adjusts rotation of the hip bone. * |
| Maintain Hip Position | Adjusts movement of the hip bone. |
| Tracking Smoothing | Applies smoothing to reduce jitter in tracking data. |
| Body Smoothing | Adds lerp-based smoothing to body movements. Each frame, bone positions and rotations are interpolated between the previous frame's values and the target values, with this setting as the interpolation factor: 0 = movement follows the target transforms exactly; 1 = movement stays at the previous frame's transforms, effectively disabling body movement. See the sketch below this table. |
| Blendshape Smoothing | Adds lerp-based smoothing to blendshape changes, using the same interpolation: 0 = blendshape values follow the target values exactly; 1 = blendshape values stay at the previous frame's values, effectively disabling blendshape tracking. |
| Jitter Removal | Reduces jitter in model movement. Not recommended. |
| Hand Offset X/Y/Z | Offsets the standard hand position of the model. |
| Apply | Adjust how much webcam tracking contributes to specific bone movements. |
| Locomotion | * |
| Gaze Strength | * |

Community Note: Entries marked with a * are either missing information or contain unconfirmed information.
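
Body Smoothing and Blendshape Smoothing both describe the same per-frame linear interpolation. As a minimal sketch (the variable and function names are illustrative, not VNyan's):

```python
def smooth(previous, target, smoothing):
    """Per-frame lerp smoothing as described in the table above.

    smoothing = 0.0 -> output follows the target exactly;
    smoothing = 1.0 -> output stays at the previous frame's value.
    """
    return smoothing * previous + (1.0 - smoothing) * target

# A blendshape target jumping from 0 to 1 is approached gradually
# when smoothing is 0.5:
value = 0.0
for frame in range(4):
    value = smooth(value, 1.0, 0.5)
    print(f"frame {frame}: {value:.3f}")  # 0.500, 0.750, 0.875, 0.938
```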

Under Blendshape Adjustments, you can modify the weights and the transformation curves that map tracking values to blendshape values. Transformation curves can be adjusted by dragging the purple squares around; right-clicking adds or removes adjustable points on the curve. Note that these values do not affect the ARKit tracking described below.
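
Conceptually, each blendshape's output is its weight multiplied by its curve-transformed tracking value. Below is a sketch of such a mapping, assuming a simple piecewise-linear curve between the adjustable points; the details of VNyan's actual curve evaluation are not documented here:

```python
import bisect

def evaluate_curve(points, x):
    """Piecewise-linear curve through sorted (x, y) control points,
    analogous to the draggable points described above.  Assumption:
    VNyan's curves may interpolate differently."""
    xs = [p[0] for p in points]
    i = bisect.bisect_right(xs, x)
    if i == 0:
        return points[0][1]   # clamp below the first point
    if i == len(points):
        return points[-1][1]  # clamp above the last point
    (x0, y0), (x1, y1) = points[i - 1], points[i]
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

# An ease-in style curve that damps small tracking values:
curve = [(0.0, 0.0), (0.5, 0.2), (1.0, 1.0)]
weight = 0.8
print(f"{weight * evaluate_curve(curve, 0.5):.2f}")  # 0.16
```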

Eye movements and blinks can be calibrated under Calibrate Eyes and Blinks. A window will pop up instructing you to look in specific directions before pressing a key to continue. Note that in order to cancel out of the calibration, you will have to close VNyan.

ARKit Tracking

VNyan supports ARKit blendshape and head tracking from most popular phone trackers, such as VTube Studio, MeowFace, or iFacialMocap. Using this feature for expression tracking requires that your avatar has the 52 ARKit blendshape clips. Select your tracking app and enter the IP address of your device, which can be found within the tracking app. Tracking will start as soon as it is found by VNyan. Connection status is indicated by the coloured dot, which shows red when the connection is unavailable and green when it is established. Make sure that the computer and phone are on the same network, and configure your firewall to allow VNyan network access.

Below is a detailed overview of the adjustable tracking settings.

| Setting Name | Description |
| --- | --- |
| Use Eye Bones | Use eye bones to map eye tracking. |
| Linked Eye Blinks | Force both eyes on your model to blink if one of them blinks. |
| Linked Blink Threshold | Adjust how sensitive Linked Eye Blinks' activation is, for example to allow winking. |
| Mirror Tracking | Inverts left and right during tracking; makes ARKit tracking function as a mirror. |
| Convert ARKit Tracking to Simple Blendshapes | * |
| Plant Feet | Forces your model's feet to remain planted on the floor. |
| Track Blendshapes | Toggle whether ARKit tracking should be used for facial tracking. |
| Movement Range | Caps the overall movement range of your model. |
| Rotation Range | Caps the overall rotation range of your model. |
| Head Tilt Range | Caps the roll of the head rotation. |
| Nod Range | Caps the pitch of the head rotation. |
| Altitude Range | Caps how much your model can move up and down. |
| Depth Range | Caps how much your model can move forwards and backwards. |
| Pelvis Stiffness | Adjusts rotation of the hip bone. * |
| Maintain Hip Position | Adjusts movement of the hip bone. |
| Smoothing | Adds lerp-based smoothing to blendshape changes, using the same interpolation as the webcam smoothing settings above: 0 = blendshape values follow the target values exactly; 1 = blendshape values stay at the previous frame's values, effectively disabling blendshape tracking. |
| Jitter Removal | Reduces jitter in model movement. Not recommended. |
| Apply | Adjust how much ARKit tracking contributes to specific bone movements. |
| Locomotion | * |
| Gaze Strength | * |

Community Note: Entries marked with a * are either missing information or contain unconfirmed information.

Under Blendshape Adjustments, you can modify the weights and the transformation curves that map tracking values to blendshape values. Transformation curves can be adjusted by dragging the purple squares around; right-clicking adds or removes adjustable points on the curve. Note that these values do not affect the webcam-based ARKit tracking described above.

Eye movements and blinks can be calibrated under Calibrate Eyes and Blinks. A window will pop up instructing you to look in specific directions before pressing a key to continue. Note that in order to cancel out of the calibration, you will have to close VNyan.

Leap Motion Tracking

VNyan supports hand and arm tracking through Leap Motion. Make sure your device's tracking software is up to date, as VNyan requires at least the Gemini version of the Leap Motion tracking service.

For the best results with Leap Motion, it is recommended that you turn on the Arm Angle setting (see Base Layer above) so that your arms fall to your sides when tracking is lost, even when using Arm Sway. For certain Leap Motion configurations, the device orientation may have to be inverted through the Leap Motion UI.

Below is a detailed overview of the adjustable Leap Motion settings.

| Setting Name | Description |
| --- | --- |
| Tracker Position | Indicate where the Leap Motion tracker is located. |
| Show Device | Shows a digital version of your Leap Motion tracker in front of your model, to compare the assumed placement with the real placement. |
| Position | Adjust where the Leap Motion is relative to your model. |
| Hand Distance | Indicate how far from the tracker your hands are located. |
| Scale | * |
| Rotation X/Y/Z | * |
| Track Blendshapes | Toggle whether webcam tracking should be used for facial tracking. |
| Hand Down Speed | Adjust how fast your model's hands move when tracking is lost. |
| Hand Up Speed | Adjust how fast your model's hands move when tracking is found. |
| Mirror Hands | Swap how your hands are mapped to your model; makes Leap Motion tracking function as a mirror. |
| Tracking Mode | Select whether Leap Motion should track your whole arms, only hand and finger movements, or only finger movements. |

Community Note: Entries marked with a * are either missing information or contain unconfirmed information.

SteamVR Tracking

SteamVR tracking support was added in VNyan 1.5.0.

VNyan supports up to 11-point VR tracking.

Community Note: This section requires substantial expansion, including a detailed overview of all the configuration values.

VMC Layers

VNyan can receive tracking data from external software over the VMC protocol. VNyan supports up to four simultaneous VMC layers. VMC trackers are set up by specifying a port that the VMC software sends to. The firewall may need to be configured to allow traffic through the specified port.

You can set up VNyan so that different body parts are tracked through different programs (e.g. hand tracking through SlimeVR and leg tracking through XR Animator). In this case, make sure that the Track sliders for the different body parts are enabled and disabled accordingly, or tracking from earlier VMC layers will be overwritten by later ones. Tracking from different VMC receivers can also be blended through a weighted average by mixing Track values, as sketched below.
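
As a sketch of that weighted blending (only the "weighted average" behaviour is documented; the normalised average shown here is an assumption about the exact math), two layers driving the same bone could be mixed in proportion to their Track values:

```python
def blend_vmc_layers(layer_values):
    """layer_values: (track_weight, bone_value) pairs from the VMC
    layers that apply to the same bone.  Returns their weighted average."""
    total = sum(weight for weight, _ in layer_values)
    if total == 0:
        return None  # no layer applies to this bone
    return sum(weight * value for weight, value in layer_values) / total

# VMC 1 at Track 0.75 and VMC 2 at Track 0.25 reporting different
# angles for the same bone blend to a value closer to VMC 1's:
print(blend_vmc_layers([(0.75, 10.0), (0.25, 30.0)]))  # 15.0
```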

Below is a detailed overview of the adjustable VMC Tracking settings.

| Setting Name | Description |
| --- | --- |
| Track | Adjust how much the specific VMC tracker contributes to specific bone movements. |
| Smoothing | Applies bone movement smoothing through linear interpolation, in the same way as the smoothing settings above: 0 = the interpolated position matches the target position, i.e. no smoothing; 1 = the interpolated position matches the previous frame's position, effectively freezing the bone in place. |
| Blendshape Tracking | Indicate whether blendshape tracking values received from the specific VMC tracker should be applied to the model. |
| Clear on Timeout | * |

Community Note: Entries marked with a * are either missing information or contain unconfirmed information.

VMC Tracker Mapping

This menu allows you to set up prop trackers. Through the Props menu, a prop's transformations can be linked to those of the mapped tracker. To map a tracker, open the VMC Tracker Mapping menu, select the VNyan Tracker slot, and specify the tracker serial number or name. In the Props menu, a prop's Linked Bone can then be set to the Tracker slot, and the prop will then follow the tracker position.

To use this feature, your tracker needs to be able to send tracker transforms through the VMC protocol. The VMC addresses used are listed below, followed by a minimal sending example:

  • /VMC/Ext/Tra/Pos
  • /VMC/Ext/Con/Pos
  • /VMC/Ext/Hmd/Pos
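
As an example, a tracker can be fed to VNyan by sending one of these addresses over OSC. Below is a minimal sketch using the python-osc package; the tracker name, transform values, and port are placeholders (use the port you configured for the VMC layer in VNyan):

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Placeholder host/port -- match the port set for the VMC layer in VNyan.
client = SimpleUDPClient("127.0.0.1", 39539)

# /VMC/Ext/Tra/Pos carries: tracker name, position x/y/z,
# rotation quaternion x/y/z/w.
client.send_message(
    "/VMC/Ext/Tra/Pos",
    ["MyTracker", 0.0, 1.0, 0.5, 0.0, 0.0, 0.0, 1.0],
)
```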

You can find a list of all available tracker names in the Monitor window, at the bottom under Received VMC Trackers, once your model is loaded and VNyan is connected to VMC tracking.
