Sensor interfaces

The K4A package is designed to support multi-sensor setups, as well as several sensor types (Azure Kinect, Kinect-v2 & RealSense D400). In all demo scenes, the available sensor interfaces reside on child objects of the KinectController game object. At start-up, the KinectManager tries to find the sensor interfaces on these child objects. If none is found, it creates a single Kinect4Azure interface on the same game object. Feel free to duplicate, create or remove sensor-interface objects, according to your specific setup. You can also change the sensor-interface settings to match your setup needs. Please note that the transform component of each sensor-interface object represents the sensor's position and rotation in the scene.
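
As a rough illustration, here is a minimal sketch of a script on the KinectController object that locates a sensor interface on a child object and sets its pose. The class name `Kinect4AzureInterface` and the namespace `com.rfilkov.kinect` are assumptions based on the package's naming and may differ in your version:

```csharp
using UnityEngine;
using com.rfilkov.kinect;  // assumed namespace of the K4A package

public class SensorSetupExample : MonoBehaviour
{
    void Awake()
    {
        // Look up the sensor interface on a child of the KinectController object,
        // i.e. the same place the KinectManager searches at start-up.
        Kinect4AzureInterface sensorInt = GetComponentInChildren<Kinect4AzureInterface>();

        if (sensorInt != null)
        {
            // The interface object's transform represents the physical sensor pose,
            // e.g. mounted 1.2 m above the floor and tilted 15 degrees downward.
            sensorInt.transform.localPosition = new Vector3(0f, 1.2f, 0f);
            sensorInt.transform.localRotation = Quaternion.Euler(15f, 0f, 0f);
        }
    }
}
```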

Here are the common settings of all sensor-interface components:

| Setting | Description |
| --- | --- |
| Device Streaming Mode | Device streaming mode: use the connected sensor (live data), play back a recording, or save the live sensor data to a recording. |
| Device Index | Index of the depth sensor in the list of currently connected sensors. |
| Recording File | Path to the recording file, when the streaming mode is PlayRecording or SaveRecording. |
| | |
| Min Distance | Minimum distance in meters, used for creating the depth-related images. |
| Max Distance | Maximum distance in meters, used for creating the depth-related images. |
| | |
| Point Cloud Resolution | Resolution of the generated point-cloud textures. |
| Point Cloud Vertex Texture | Render texture used for point-cloud vertex mapping. The texture resolution should match the depth or color image resolution. |
| Point Cloud Color Texture | Render texture used for point-cloud color mapping. The texture resolution should match the depth or color image resolution. |
| Point Cloud Player List | List of comma-separated player indices to include in the point cloud. Use -1 for all players, or leave the list empty for the full point cloud. |
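
The sketch below shows how a few of these settings could be adjusted from script instead of the inspector. The field names used here (`deviceIndex`, `minDistance`, `maxDistance`, `pointCloudVertexTexture`, `pointCloudColorTexture`) are assumed to mirror the inspector labels in the table above and may differ in your package version:

```csharp
using UnityEngine;
using com.rfilkov.kinect;  // assumed namespace of the K4A package

public class SensorSettingsExample : MonoBehaviour
{
    public RenderTexture vertexTexture;  // should match the depth (or color) image resolution
    public RenderTexture colorTexture;

    void Awake()
    {
        var sensorInt = GetComponentInChildren<Kinect4AzureInterface>();  // assumed class name
        if (sensorInt == null)
            return;

        sensorInt.deviceIndex = 0;     // first connected sensor
        sensorInt.minDistance = 0.5f;  // meters, for the depth-related images
        sensorInt.maxDistance = 4.0f;  // meters

        // Point-cloud mapping textures; leave them unset if you don't need a point cloud.
        sensorInt.pointCloudVertexTexture = vertexTexture;
        sensorInt.pointCloudColorTexture = colorTexture;
    }
}
```

Such changes would typically need to happen before the KinectManager initializes the sensor, otherwise they may not take effect until the scene is restarted.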