Synthetic Humans Tutorial
This tutorial provides step-by-step instructions for integrating the Synthetic Humans package into your Unity project and generating humans. We will start by downloading and installing Unity Editor and the required packages. We will then use the sample assets provided by the Synthetic Humans package to generate diverse humans and situate them into our simulation using a variety of placement techniques.
The Synthetic Humans package relies on the Unity Perception Package to facilitate human generation and randomization. Thus, if you are not already familiar with the Perception package, we strongly recommend completing the main tutorial and the human labeling tutorial offered with the Perception package before diving into Synthetic Humans.
Throughout the tutorial, lines starting with 🟢 denote the individual actions you will need to perform in order to progress through the tutorial. The rest of the text will provide additional context and explanation around the actions. If in a hurry, you can just follow the actions!
The steps included in this tutorial are:
- Step 1: Download Unity and Create a New Project
- Step 2: Install Packages, Import Samples, Set Things Up
- Step 3: Generate Your First Humans
- Step 4: Try Different and Multiple Generation Configs
- Step 5: Use a Variety of Placement Methods
- Step 6: Add Labelers and Generate a CV dataset
ℹ️ If you face any problems while following this tutorial, please contact the Unity CV team at computer-vision[at]unity3d.com.
🟢 Navigate to this page to download and install the latest version of Unity Editor 2021.3.x. (This tutorial has not yet been fully tested on newer versions.)
An alternative approach is to first install Unity Hub, which will allow you to have multiple versions of Unity on your computer, and make it easier to manage your Unity projects and the versions of Unity they will use.
🟢 Open Unity or Unity Hub and create a new project with the Unity version you just downloaded and choose the 3D (HDRP) Core template. You will first need to download the template as shown below. Name your new project Synthetic Humans Tutorial.
The project will be created and opened in Unity Editor.
🟢 Close the HDRP Wizard window if it is displayed.
ℹ️ Note that the above screenshot depicts Unity Hub version 3.2.0. Other versions of the hub may look different. Any HDRP project, no matter the template, will work for this tutorial.
For this step, you need to ensure that you have installed and properly set up both git and git lfs on your machine. Additionally, you will need to use either SSH or HTTPS authentication on GitHub.
ℹ️ The links provided here are for HTTPS. If you prefer to use SSH authentication, you can replace `https://` with `ssh://git@`.
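For example, the Perception URL used in the next step would take these two forms (with `<TAG>` standing in for the tag placeholder explained below):

```
https://github.com/Unity-Technologies/perception.git?path=/com.unity.perception/com.unity.perception#<TAG>
ssh://git@github.com/Unity-Technologies/perception.git?path=/com.unity.perception/com.unity.perception#<TAG>
```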
IMPORTANT: Below you will find repository URLs in which you will need to replace the final part with Git tags. Use the Git tag for the current version of Synthetic Humans and the version of Perception that is compatible with it. For a list of compatible tags from both repos, look at the Package Version Compatibility section in the landing page of the Synthetic Humans repository.
🟢 Once your new project is opened, open Window -> Package Manager.
🟢 Click on the + sign at the top-left corner of the Package Manager window and choose Add package from git URL.... Use the link below or its SSH equivalent:
```
https://github.com/Unity-Technologies/perception.git?path=/com.unity.perception/com.unity.perception#<TAG, BRANCH, OR COMMIT OF THE COMPATIBLE PERCEPTION RELEASE>
```
After installing Perception, if you receive an error regarding the version of the Burst package, restart Unity.
🟢 While still in Package Manager, install the Synthetic Humans package using this git URL:
```
https://github.com/Unity-Technologies/com.unity.cv.synthetichumans.git#<TAG, BRANCH, OR COMMIT OF DESIRED SYNTHETIC HUMANS RELEASE>
```
Downloading the Synthetic Humans package can take upwards of 15 minutes. Expect to see a progress bar stuck on "Resolving Packages" for a long time. After the package is downloaded there will be a typically lengthy import time as well.
Once the download and import is complete, you will see the Synthetic Humans package listed in Package Manager.
🟢 Select the Synthetic Humans package in Package Manager and import the sample bundle named All Samples.
The samples bundle contains a variety of assets that will help accelerate your workflows, including this tutorial. Once the samples are imported, you will see a folder named `Samples` under your project's `Assets` folder. Samples are organized into folders matching the version of the package they were imported from.
Synthetic Humans uses a specific HDRP skin diffusion profile to control how human skin diffuses light.
🟢 Open Edit -> Project Settings -> Graphics -> HDRP Global Settings.
🟢 Click + on the list under Diffusion Profile Assets. Note that your list might contain more items, which is OK.
🟢 In the window that opens, enable the display of non-project assets using the toggle at the top right corner, so that the skin profile included in the Synthetic Humans package becomes visible. Then select the profile named `Skin.asset`, located inside the Synthetic Humans package at `Packages/com.unity.cv.synthetichumans/Resources/DiffusionProfiles/Skin.asset`.
To ensure we get crisp, blur-free images, we need to disable motion blur. HDRP projects have motion blur and a number of other post-processing effects enabled by default. Since we will move the generated humans rapidly in our simulation, we need to make sure they don't leave blurry trails behind.
🟢 Open the Scene named `TutorialScene`, which is included in the samples you imported (`Assets/Samples/Synthetic Humans/<package_version>/All Samples/Scenes/`).
🟢 Create an empty GameObject in your Scene and name it Volume (right-click in the Hierarchy tab and select Create Empty).
🟢 Add a `Volume` component to the new object.
🟢 Set the Volume's Profile field to the `DefaultSettingsVolumeProfile` asset.
🟢 Uncheck the Motion Blur option.
Installations and settings are now done! We are ready to jump into a Scene.
🟢 Create an empty GameObject and name it Scenario.
🟢 Add a `FixedLengthScenario` component to your newly created object.
ℹ️ Let's brush up on our Perception knowledge a bit. Scenarios control the execution flow of your simulation by coordinating all `Randomizer` components added to them. The Perception and Synthetic Humans packages each come with a useful set of Randomizers for common tasks, such as instantiating and placing objects, randomizing attributes like textures and animations, randomizing camera positions and post-process effects, and so on. Randomizers achieve this by coordinating a number of `Parameter`s, which define the most granular randomization behaviors. For instance, for continuous variable types such as floats, vectors, and colors, Parameters can define the range and sampling distribution for randomization.
ℹ️ While their name suggests that they are meant for randomization, you can use Randomizers to carry out any operation. The Scenario provides lifecycle hooks for you to easily override what your Randomizer does at the beginning and end of the Scenario and of each Iteration. One of the most useful aspects of using Randomizers in this way is the clear control you get over the order of operations during each Iteration of a Scenario, as Randomizers are executed in the order they are added to the Scenario.
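To make this concrete, here is a minimal sketch of a custom Randomizer and its companion RandomizerTag, modeled on the pattern taught in the Perception tutorial. The names `MyScaleRandomizer` and `MyScaleRandomizerTag` are hypothetical, not part of either package:

```csharp
using System;
using UnityEngine;
using UnityEngine.Perception.Randomization.Parameters;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Samplers;

// Hypothetical tag: add this component to any object the Randomizer should affect.
public class MyScaleRandomizerTag : RandomizerTag { }

[Serializable]
[AddRandomizerMenu("Perception/My Scale Randomizer")]
public class MyScaleRandomizer : Randomizer
{
    // A Parameter wraps a Sampler, defining the range and distribution to draw from.
    public FloatParameter scale = new FloatParameter { value = new UniformSampler(0.5f, 2f) };

    // Lifecycle hook: runs at the start of each Scenario Iteration, in the
    // order Randomizers are listed on the Scenario.
    protected override void OnIterationStart()
    {
        // Query all objects carrying the tag and apply a freshly sampled scale.
        foreach (var tag in tagManager.Query<MyScaleRandomizerTag>())
            tag.transform.localScale = Vector3.one * scale.Sample();
    }
}
```

The same tag-and-query mechanism is what connects the humans generated in this tutorial to the placement Randomizers used later on.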
🟢 Click Add Randomizer. Search for and add Human Generation Randomizer (located under the Synthetic Humans submenu).
This Randomizer has the task of generating humans. It takes a list of Human Generation Config assets along with a probability for each; each config is used for the corresponding fraction of the generated humans. By default the probabilities are uniform across all configs. At the start of the Scenario, a pool of humans is generated (Human Pool Size), and Pool Refresh Interval Iterations specifies how often this pool is regenerated. Between pool refreshes, in each Iteration, a random subset of the humans is activated and the rest are deactivated. This cycle continues until the Scenario is complete. For example, with a pool size of 50 and two configs at probabilities 0.2 and 0.8, roughly 10 humans in the pool would be generated with the first config and 40 with the second.
🟢 In the UI for the Randomizer you just added, set the range for Active Humans In Each Iteration to Min: 1 and Max: 1. This will cause exactly one human to be active in each Iteration of the Scenario.
🟢 Click Add Option on the list of Human Generation Configs in the Randomizer, and select the config named `SampleHumanGenerationConfig_Default`. This config is included in the samples you imported.
The Scenario should now look like this:
It's time to see some results!
🟢 Click on the ▷ (play) button located at the top middle section of the editor to run your simulation.
You should see diverse humans being generated and shown at the center of the Game view:
ℹ️ The images and gifs in this tutorial were captured using a 1280 x 960 resolution for the Game view. You can use an identical aspect ratio to get similar output.
Let's have a look at how this is working. Human Generation Config assets are scriptable objects that describe the distribution of various human properties. These include age, gender, ethnicity, height, and weight, as well as the assets that should be used for generating humans, in the form of a `SyntheticHumanAssetPool` type scriptable object assigned to the Asset Tag Pool field of each config. This allows configs to utilize different asset pools.
A number of other settings and properties are specified here as well. The Base Prefab lets you start from an existing prefab to create humans. For example, this can come in handy if you need a certain RandomizerTag component on all of the generated humans. The image below shows an example Human Generation Config. Here, the selected Asset Pool is the default one that comes with Synthetic Humans, named `DefaultSyntheticHumanAssetPool`. The selected Base Prefab is named `HumanBase_TransformPlacement` and comes with the samples you imported earlier.
Most of the fields in the Human Generation Config UI have tooltips that pop up when you hover your mouse pointer over them. We recommend checking these out to learn more about each field.
The Age Range, Height Range, and Weight Range properties are input using Perception Samplers, which allow for a variety of ways to specify distributions for random sampling. All three are set to Uniform by default, which returns a random number uniformly sampled from the specified range. Age is specified in years (integers). Height and weight are specified as floats between 0 and 1, with smaller values mapping to smaller weight and height. The specified range for height and weight cannot exceed [0,1].
⚠️ Note that height and weight do not scale linearly, so a value of 0.5 is not the halfway point for height and weight. We recommend experimenting with different ranges to achieve the desired outputs.
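To illustrate the Sampler types behind these fields, here is a small sketch using Perception's sampler classes. The `SamplerDemo` class and the chosen ranges are just examples for this tutorial:

```csharp
using UnityEngine;
using UnityEngine.Perception.Randomization.Samplers;

public class SamplerDemo : MonoBehaviour
{
    void Start()
    {
        // Uniform: every value in [min, max] is equally likely.
        var height = new UniformSampler(0f, 1f);

        // Constant: always returns the same value; equivalent to a [0,0] uniform range.
        var weight = new ConstantSampler(0f);

        Debug.Log($"height={height.Sample()}, weight={weight.Sample()}");
    }
}
```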
Besides the settings discussed here, the config allows you to set other properties such as the types of clothing used and joint self-occlusion distance. You can also generate humans with specific pre-selected assets (e.g. a specific body material), which is mostly useful for debugging human generation assets.
We will now try modifying the generation settings and using more than one config to observe the outcomes.
🟢 Create two copies of `SampleHumanGenerationConfig_Default` and name them `CustomHumanGenerationConfig_1` and `CustomHumanGenerationConfig_2`.
🟢 Modify `CustomHumanGenerationConfig_1` to only generate adult women with the smallest values for both height and weight. To achieve this:
- Change the range for age to 20 to 65 (adult)
- Enable only the Female checkbox for sex
- Set both the height and weight ranges to [0,0] (alternatively, you can change from Uniform sampling to Constant sampling and use a value of 0, which would have the same result).
🟢 Modify `CustomHumanGenerationConfig_2` to only produce women with the largest value for height but still the smallest weight. To achieve this:
- Change the range for age to 20 to 65 (adult)
- Enable only the Female checkbox for sex
- Set the height range to [1,1]
- Set the weight range to [0,0]
We are using the extreme ends of the available ranges to clearly see the difference in the humans generated by these two configs.
🟢 Clear the list of configs in the Human Generation Randomizer and add the two you just created.
🟢 Uncheck Uniform and set a probability of 0.2 for `CustomHumanGenerationConfig_1` and 0.8 for `CustomHumanGenerationConfig_2` (the probabilities should always add up to 1).
The Randomizer should look like this:
🟢 Click ▷ again. This time, about 20% of the generated humans will be short and slim women, and 80% will be tall and slim women.
Earlier we mentioned the base prefab referenced in the Human Generation Configs you have been using. The prefab is named `HumanBase_AllPlacementMethods`, and humans generated with this base prefab are ready to be placed using a multitude of placement methods, because they have Randomizer Tag components corresponding to these methods. For instance, the `TransformPlacementRandomizerTag` component causes the humans generated with this base prefab to be affected by the Transform Placement Randomizer. We will now try a few Randomizers.
🟢 Add Placement - Transform Randomizer from the Add Randomizer menu (located under the Synthetic Humans submenu), and set the following values:
- Randomize Position: Enabled
- Volume Size X: 3
- Volume Size Y: 3
- Volume Size Z: 3
Leave the other settings as they are. With these values, we randomly move the humans that have a `TransformPlacementRandomizerTag` component within a 3x3x3 volume centered at (0,0,0). This Randomizer can also randomize the rotation of the target objects, but we will not use that functionality for now.
🟢 In Human Generation Randomizer, set Active Humans in Each Iteration range to [5,5].
🟢 Click ▷. You will see 5 humans generated in each Iteration, randomly spread out in a 3x3x3 volume:
🟢 Add Synthetic Human Animation Randomizer to your Scenario.
The base prefab we are using for our Human Generation Configs has a Synthetic Human Animation Randomizer Tag component, with a list of animation tags added. This Randomizer Tag component will make these humans a target for the Synthetic Human Animation Randomizer. This Randomizer assigns a random animation clip to the target human.
The pool of animations to choose from depends on the Synthetic Human Animation Randomizer Tag component on the human object (in this case, on the base prefab named `HumanBase_AllPlacementMethods`). If the Use Global Animation Pool option is checked on this component, the pool of animations to choose from is all animations found in the Animation Tag Pool selected on the corresponding Randomizer, with the selection made based on compatibility settings. If the option is disabled, the animation is randomly selected from the immediate list of animation tags added to the Randomizer Tag component. This is the mode of operation we will use here, so we will leave Use Global Animation Pool disabled.
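The selection logic can be summarized roughly as follows. This is an illustrative sketch only, not the package's actual API; the type and parameter names here are made up for clarity:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only -- not the package's real API.
public static class AnimationSelectionSketch
{
    // Picks the source pool based on the Use Global Animation Pool toggle,
    // then samples one clip uniformly at random.
    public static AnimationClip SelectClip(
        bool useGlobalAnimationPool,
        IReadOnlyList<AnimationClip> localClips,   // clips from tags on the Randomizer Tag component
        IReadOnlyList<AnimationClip> globalClips)  // compatible clips from the global Animation Tag Pool
    {
        var pool = useGlobalAnimationPool ? globalClips : localClips;
        return pool[Random.Range(0, pool.Count)];
    }
}
```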
🟢 Click ▷ to see the humans receive random animations:
The Synthetic Humans package also comes with a Randomizer for placing objects in specific positions. This Randomizer is named Point List Placement Randomizer. It takes a list of Game Objects and uses their positions to place objects tagged with `PointListPlacementRandomizerTag`. This Randomizer also has the ability to randomly shift objects and randomize their rotation.
The Scene contains an object named PointPlacementAnchors, which contains four Game Objects. We will use these to try the point based placement.
🟢 Disable Transform Placement Randomizer using the checkbox located at the top left corner of the Randomizer's UI.
🟢 Add Point List Placement Randomizer to your Scenario.
🟢 Add 4 empty entries to the Anchor Points list and then drag and drop the Point objects from the Hierarchy tab into each of the empty anchor fields.
🟢 Apply these additional settings:
- Randomize Rotation: Enabled
- Rotation Range X: [0,180]
- Rotation Range Y: [0,180]
- Rotation Range Z: [0,180]
- X Translation Shift: [0,0]
- Y Translation Shift: [0,0]
- Z Translation Shift: [0,0]
The Randomizer now looks like this:
With these settings, the Randomizer will only rotate the generated humans and not move their position.
🟢 Click ▷ to see the new placement Randomizer in action.
The humans will be placed on the 4 predefined positions you added to the Randomizer, and will have their rotations randomized.
ℹ️ Since we have only 4 anchor points and 5 humans to place, a warning will appear in the Console tab saying that some of the points will be reused. In this case, the bottom left anchor point will always have two humans around it, since the Randomizer goes through the points sequentially when placing humans.
The Perception and Synthetic Humans packages each provide a variety of Labelers, with the former focused on general-purpose labeling such as 2D bounding boxes and segmentation, and the latter concentrating on human-related labeling, such as human metadata and 3D keypoints. All of these are based on Perception's `CameraLabeler` class, which you can extend to create your own custom Labelers.
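As a rough sketch of what extending `CameraLabeler` looks like, consider the hypothetical Labeler below. The class name and its behavior are made up for illustration, and the exact set of required abstract members varies between Perception versions, so treat this as an outline rather than a drop-in implementation:

```csharp
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

// Hypothetical example: logs how many labeled objects are active each frame.
public class LabeledObjectCountLabeler : CameraLabeler
{
    public override string description => "Counts active labeled objects";
    public override string labelerId => "LabeledObjectCount";  // required in Perception 1.x
    protected override bool supportsVisualization => false;

    // One-time initialization when the Perception Camera starts capturing.
    protected override void Setup() { }

    protected override void OnUpdate()
    {
        // A real Labeler would report annotations through the dataset output APIs;
        // this placeholder just logs a count to the Console.
        Debug.Log($"Labeled objects: {Object.FindObjectsOfType<Labeling>().Length}");
    }
}
```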
It is now time to label the humans we have been generating.
🟢 Add a `PerceptionCamera` component to the Main Camera object.
At the bottom of the Inspector UI, you may note a few error and warning messages. Let's fix these:
🟢 Open Edit -> Project Settings -> Editor, and disable Asynchronous Shader Compilation.
🟢 Search for and select the asset named `HDRPHighQuality` in your project, and set Lit Shader Mode to Both.
ℹ️ By default, `HDRPHighQuality` is used. If you plan to use other quality settings, make sure to apply the above fix to them too.
🟢 On the list of Camera Labelers on your Perception Camera, add the Instance Segmentation Labeler, Human Metadata Labeler, Human Mesh Labeler, Keypoint 3d Labeler, and Keypoint Labeler.
🟢 Assign the asset named `SyntheticHumansSampleIdLabelConfig` to the Id Label Config fields of the Instance Segmentation Labeler and the Keypoint Labeler.
🟢 Enable Export Mesh Triangles on Human Mesh Labeler.
🟢 Add `SyntheticHumansSampleCocoKeypointTemplate` to the Active Template field of the Keypoint Labeler.
These labelers will produce:
- Instance Segmentation Labeler: Segmentation masks (.png) in which each individual labeled object has a unique color.
- Keypoint Labeler: Screen coordinates and visibility status of skeleton joints, as well as current pose.
- Human Metadata Labeler: The human properties (age, gender, ethnicity, height/weight) and assets used in generating the humans.
- Human Mesh Labeler: .obj mesh files that capture the state of the humans after they are animated and blend shapes are applied.
- Keypoint 3d Labeler: The 3D world position of the joints in the human skeleton.
If you are already familiar with Perception, you'll know that objects need to have a `Labeling` component on them to be included in the Labeler outputs. Also, for some Labelers, like the 2D Bounding Box Labeler, the object's `Labeling` component needs to have labels that are included in the Label Config used by the Labeler. The sample base Prefabs provided with Synthetic Humans all have a `Labeling` component on them, with one label added: `person`. This label is also present in the sample IdLabelConfig we used above.
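If you ever build your own base prefab, the same setup can be done in script along these lines. This is a minimal sketch; the `LabelingSetupExample` class is hypothetical, while `Labeling` and its `labels` list come from the Perception package:

```csharp
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

public class LabelingSetupExample : MonoBehaviour
{
    void Awake()
    {
        // Ensure the object has a Labeling component carrying the "person" label,
        // matching the sample IdLabelConfig used in this tutorial.
        var labeling = gameObject.GetComponent<Labeling>();
        if (labeling == null)
            labeling = gameObject.AddComponent<Labeling>();
        if (!labeling.labels.Contains("person"))
            labeling.labels.Add("person");
    }
}
```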
🟢 Two of the Labelers we added support real-time visualization. Enable Show Labeler Visualizations on the Perception Camera to see this in action.
🟢 Click ▷. You will see the instance segmentation and keypoints outputs overlaid on the Game view.
🟢 To navigate to the location where your dataset is stored, click Show Folder on the Perception Camera. The dataset is produced in SOLO format, with each Iteration of the Scenario captured in one `sequence.X` folder. Have a look at the contents of a few such folders to see what the data looks like. For documentation on the SOLO format, check out this page.
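As a rough orientation, a SOLO dataset folder looks something like the tree below. File names can vary between Perception versions, so treat this as an approximation rather than a specification:

```
solo/
├── metadata.json
├── annotation_definitions.json
├── sequence.0/
│   ├── step0.camera.png        # RGB capture for the first step
│   └── step0.frame_data.json   # captures, annotations, and metrics for the step
└── sequence.1/
    └── ...
```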
This concludes the introductory tutorial to the Synthetic Humans package. If you encountered any problems while following this tutorial, please contact the Unity CV team at computer-vision[at]unity3d.com. We encourage you to browse through the rest of the pages in the Synthetic Humans wiki to learn about the other tools and features the package provides.
Check out the Advanced Placement Tutorial to learn how to use NavMeshes and Placers to place humans into environments realistically.