# How to Run 'Azure-Kinect Examples for Unity'
Despite its name, ‘Azure-Kinect Examples for Unity’ can work with several depth sensors – Azure-Kinect, RealSense and Kinect-v2. The installation depends on what sensor you have at your disposal.
- (Azure Kinect) Download and install Azure-Kinect Sensor SDK, as described in the ‘Azure-Kinect SDKs’-section below.
- (Azure Kinect) Download and install Azure-Kinect Body Tracking SDK, as described in the ‘Azure-Kinect SDKs’-section below. Then open ‘Azure Kinect Body Tracking Viewer’ to check whether the body tracker works as expected.
- (Kinect-v2) Download and install Kinect SDK 2.0, as described in the ‘Kinect-v2 SDK’-section below.
- (RealSense) Download and install RealSense SDK 2.0, as described in the ‘RealSense SDK’-section below.
- (iPhone Pro) For integration with the iPhone’s LiDAR sensor, please look at this tip.
- Import this package into a new Unity project.
- Open ‘File / Build Settings’ and switch the platform to ‘PC, Mac & Linux Standalone’, with target platform ‘Windows’ and architecture ‘x86_64’. In ‘Player Settings’, make sure the ‘Scripting Backend’ is set to ‘Mono’.
- Make sure that Direct3D11 is the first option in the ‘Auto Graphics API for Windows’-list setting, in ‘Player Settings / Other Settings / Rendering’.
- Open and run a demo scene of your choice from a subfolder of the 'AzureKinectExamples/KinectDemos'-folder. Short descriptions of the available demo scenes are available here.
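If a demo scene fails to start, a quick way to narrow down the cause is to confirm that the required SDK runtimes are actually present on disk. The sketch below is an illustrative Python check, not part of the package; the file paths are the default Windows install locations of Azure-Kinect Sensor SDK v1.4.1, the Body Tracking SDK and Kinect SDK 2.0 (assumptions – adjust them to the versions you installed):

```python
# Sketch: check that the sensor SDK runtimes exist on disk.
# The paths below are ASSUMED default Windows install locations --
# edit them to match the SDK versions you actually installed.
from pathlib import Path

SDK_CHECKS = {
    "Azure Kinect Sensor SDK":
        r"C:\Program Files\Azure Kinect SDK v1.4.1\sdk\windows-desktop\amd64\release\bin\k4a.dll",
    "Azure Kinect Body Tracking SDK":
        r"C:\Program Files\Azure Kinect Body Tracking SDK\sdk\windows-desktop\amd64\release\bin\k4abt.dll",
    "Kinect SDK 2.0":
        r"C:\Windows\System32\Kinect20.dll",
}

def missing_sdks(checks: dict) -> list:
    """Return the names of SDKs whose key runtime file was not found."""
    return [name for name, path in checks.items() if not Path(path).exists()]

if __name__ == "__main__":
    missing = missing_sdks(SDK_CHECKS)
    if missing:
        print("Not found:", ", ".join(missing))
    else:
        print("All checked SDK runtimes appear to be installed.")
```

Note that only the SDK for the sensor you actually own needs to be installed, so entries reported as missing for the other sensors are expected and harmless.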