Blueprint Nodes - Interactml/iml-unreal GitHub Wiki
Unreal Blueprints
"Blueprints" is the name of the visual scripting system built into Unreal Engine. They provide a friendly way of adding functionality to your application or game without needing to dig down into code.
images/ui/UnrealBlueprintsExample.png
InteractML systems are built within the Blueprint editor via a custom set of nodes for working with training data and machine learning models. You get to create your own ML systems whilst leveraging all the power that the library of built-in (and third-party) Blueprints brings.
Learn more about the Unreal Blueprint Visual Scripting system.
Adding Nodes
The custom nodes provided are under the InteractML section of the node graph context menu.
images/ui/BlueprintContextMenuMainNodes.png
Main Node Types
These four main nodes provide the majority of the functionality you will need to build machine learning systems in Unreal.
- Parameter Collection
- Example Recording
- Model Training
- Model Running
NOTE: Apart from the Parameter Collection node, the main functionality of these nodes is only active when one of the Boolean control inputs is active. For proper operation, they need to have their execution inputs images/ui/Pin_Exec.png connected and running on every engine tick, i.e. fed in some way from an Event Tick Blueprint node.
See Utility Blueprints to learn about other support nodes and functionality that is available.
Parameter Collection Node
The first node you will need is the Parameter Collection node. This is used to combine the assorted input values you will be training and matching against. These are the values that drive your system. They could be player position, hand positions, head orientation, mouse position, button states, or anything else that falls under the loose category of 'an input'.
The output Parameters images/ui/Pin_Struct.png pin houses the collected values together as a single 'value'. The input parameters are converted into a list of floating point values, which can then be passed as input to the training set recording node or the model running node.
images/ui/Node_CollectAllTheThings.png
The following Parameter types are supported by the New Parameter input pins:
- images/ui/Pin_Int.png Integer - whole numbers (-1, 0, 1, 2, 3, etc.)
- images/ui/Pin_Float.png Float - real numbers (1.0, 3.1415, -98.76, etc.)
- images/ui/Pin_Bool.png Boolean - logical values (false or true)
- images/ui/Pin_Struct.png 2D Vector - two floating point numbers (e.g. a point, size, or direction in two dimensions)
- images/ui/Pin_Vector.png 3D Vector - three floating point numbers (e.g. a point, size, or direction in three dimensions)
- images/ui/Pin_Struct.png Rotation - rotation information
- images/ui/Pin_Struct.png Colour - Red, Green, Blue (and alpha) values in floating point form (0.0 to 1.0 for each channel)
NOTE: Rotation uses quaternion representation which is better for interpolation than Euler angle representation in general, but you may find converting a direction into a unit vector (2D or 3D) produces better results.
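To illustrate the note above, here is a minimal sketch (in Python, with a hypothetical helper name; this is not InteractML code) of converting a rotation into a unit direction vector before feeding it in as a parameter:

```python
# Conceptual sketch: representing a facing as a unit direction vector
# rather than as rotation angles, as the note above suggests.
# "yaw_to_unit_vector" is a hypothetical helper, not part of InteractML.
import math

def yaw_to_unit_vector(yaw_degrees):
    """Convert a yaw rotation (degrees) to a 2D unit direction vector."""
    r = math.radians(yaw_degrees)
    return (math.cos(r), math.sin(r))

v = yaw_to_unit_vector(90.0)
print(v)  # approximately (0.0, 1.0)
```

Unlike raw angles, unit vectors have no wrap-around discontinuity at 360°, which can make nearby directions look nearby to the model.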
This node also supports Arrays of values. All the above types can be collected into an array and plugged into a single input.
NOTE: Changing the quantity of items in an array could invalidate any stored training data, especially if other parameters are collected after the array.
Pins can be added as needed via the Add Pin + button on the node and removed by right clicking on the input pin and selecting Remove parameter. The type of the pin is set when you first connect to each un-typed images/ui/Pin_Wild.png New Parameter pin.
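Conceptually, the node flattens all of its typed inputs into one list of floats. The following Python sketch (hypothetical function name, not the InteractML implementation) shows the idea:

```python
# Conceptual sketch (not actual InteractML code): flattening mixed
# parameter types into a single list of floats, as the Parameter
# Collection node does internally with its typed input pins.

def flatten_parameters(*params):
    """Flatten ints, floats, bools, and vector-like tuples into floats."""
    values = []
    for p in params:
        if isinstance(p, bool):             # check bool before int: bool is an int subclass
            values.append(1.0 if p else 0.0)
        elif isinstance(p, (int, float)):
            values.append(float(p))
        elif isinstance(p, (tuple, list)):  # 2D/3D vectors, colours, arrays
            values.extend(float(v) for v in p)
        else:
            raise TypeError(f"Unsupported parameter type: {type(p)}")
    return values

# A player position (3D), a button state, and a mouse position (2D)
print(flatten_parameters((1.0, 2.0, 3.0), True, (0.5, 0.25)))
# → [1.0, 2.0, 3.0, 1.0, 0.5, 0.25]
```

This also makes the array note above concrete: if the length of an array input changes, every value after it shifts position in the flattened list, invalidating previously recorded examples.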
See Collecting Parameters for a more in-depth look at using this node.
Example Recording Node
Next you will need to record the examples the machine learning model will learn from. This is done using the Example Recorder node.
This takes a Training Set object to record into, the Parameters and Outputs that are to be associated together as an example, and several control pins. The Example Recorder node also has some configuration options available in the details panel when the node is selected.
images/ui/Node_ExampleRecorder.png
See Recording Examples for a more in-depth look at using this node.
Training Set Input
The images/ui/Pin_Object.png Training Set input should be provided with a training set object to record examples into. Normally you can just select an existing asset from the project via the drop-down menu on the pin, but it could also be supplied by selection logic for more advanced use cases. The asset will be updated with any examples recorded into it.
See Asset Types for more on the Training Set asset.
Example Inputs
The images/ui/Pin_Struct.png Live Parameters pin is fed with parameter data (originating from a "Collect All The Things" node somewhere) and, when recording is activated, forms the set of values associated with the current Expected Output state.
The images/ui/Pin_Float.png Expected Output pin is used to specify the set of outputs a running model is expected to produce when its inputs match the set of values currently on the Live Parameters pin. The Expected Output pin can operate in one of two modes according to the node configuration (see below):
- Simple - a single floating point value
- Composite - a set of values defined according to the configured Label type
Control Inputs
The operation of the recording node is only active when requested and there are several functions that can be performed:
- images/ui/Pin_Bool.png Record - Record an example, either a single snapshot or a series of values (if configured as a Series recorder, see below)
- images/ui/Pin_Bool.png Delete Last - Delete the newest recorded example from the training set (like an Undo for recording)
- images/ui/Pin_Bool.png Delete Output - Deletes all examples associated with the current Expected Output value
- images/ui/Pin_Bool.png Delete All - Deletes all recorded examples, leaving the training set empty (as if it were a new training set)
These inputs are Boolean values meaning they can be true or false. Their functions are activated when the input goes from false to true ('rising edge trigger'). Typically you can just hook up a button input to these to operate them.
For Series Recording the Record input is 'active high' meaning that it records a sequence of parameters for every update tick that the input is true. This allows sequences to be recorded into a single example and associated with the Expected Output (taken at the start of recording).
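The two trigger styles described above can be sketched in a few lines. This is a conceptual Python illustration (hypothetical class name), not the InteractML implementation:

```python
# Conceptual sketch of the 'rising edge trigger' used by the control pins:
# the action fires once when a bool input goes false -> true, rather than
# on every tick the input is held true. (Series recording, by contrast, is
# 'active high': it acts on every tick the input stays true.)

class EdgeTrigger:
    """Fires once when a bool input transitions false -> true."""
    def __init__(self):
        self.previous = False

    def update(self, current):
        fired = current and not self.previous
        self.previous = current
        return fired

trigger = EdgeTrigger()
inputs = [False, True, True, False, True]   # button held two ticks, released, pressed again
fires = [trigger.update(i) for i in inputs]
print(fires)  # → [False, True, False, False, True]
```

Note the button being held for two ticks only fires the action once; releasing and pressing again fires it a second time.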
Configuration
The Example Recorder has configuration options that are not designed to be changed dynamically at runtime; they are static options that significantly affect the behaviour and available input/output pins of the node. Because they affect the node graph, they can only be applied at edit time.
- Mode - The Example Recorder can operate in Single sampling or Series sampling mode, affecting the behaviour of the Record pin and how many samples of the Live Parameters pin are taken.
- Label Type - The Example Recorder can associate Live Parameters with either a simple numeric (float) Expected Output or a more complex Composite Expected Output. Changing this option changes the data type of the Expected Output pin. Select one of the Labels assets from the drop-down UI in the details panel for the node; the "None" option switches the Expected Output to the simple numeric mode.
Model Training Node
Once you have recorded some examples you need to train a model with them. The "Teach The Machine" node performs this function, associating a training set with a model and triggering the training process.
images/ui/Node_TeachTheMachine.png
See Training Models for a more in-depth look at using this node.
Targets
The images/ui/Pin_Object.png Model and images/ui/Pin_Object.png Training Set inputs should be provided with a model object to be trained and a training set object supplying examples to train from. Normally you can just select existing assets from the project via the drop-down menus on the pins, but they could also be supplied by selection logic for more advanced use cases. The model will be trained with the provided examples and will store everything it needs to run.
See Asset Types for more on these asset types.
Control Inputs
The operation of the training node is only active when requested and there are several functions that can be performed:
- images/ui/Pin_Bool.png Train - Train the model using the provided training set of examples
- images/ui/Pin_Bool.png Reset - Reset the model to an untrained state (as if it were a new asset)
These inputs are Boolean values meaning they can be true or false. Their functions are activated when the input goes from false to true ('rising edge trigger'). Typically you can just hook up a button input to these to operate them.
Outputs
The Teach The Machine node has additional outputs as follows:
- Trained images/ui/Pin_Bool.png - Get the current state of the connected model, whether it is trained (true) or untrained (false).
Configuration
The training node has configuration options that are not designed to be changed dynamically at runtime; they are static options that significantly affect the behaviour and available input/output pins of the node. Because they affect the node graph, they can only be applied at edit time.
- Background Operation - The training node can operate in the background (asynchronously), see below for details.
Background Operation
The training node can operate in the background (asynchronously) to avoid stalling the main engine thread and rendering. This is needed when training operations are long or when an application can't afford any rendering interruptions (e.g. VR). Enabling this changes the operating mode and provides some additional outputs:
images/ui/Node_TeachTheMachine_Async.png
- Training images/ui/Pin_Bool.png - Is the model currently being trained? (true whilst training ongoing in the background)
- Completed images/ui/Pin_Exec.png - Execution output that is fired for one tick after the training has completed
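The Training/Completed behaviour above can be sketched conceptually. This Python illustration (hypothetical class name, using a worker thread as a stand-in for the engine's async task system) shows the pattern, not the InteractML implementation:

```python
# Conceptual sketch (hypothetical names): background training keeps the
# main loop responsive. A 'training' flag is true while the worker runs,
# and a completion event fires once when it finishes - mirroring the
# Training and Completed outputs of the async Teach The Machine node.
import threading
import time

class AsyncTrainer:
    def __init__(self):
        self.training = False
        self.completed = threading.Event()

    def train(self, examples):
        """Kick off training on a worker thread and return immediately."""
        self.training = True
        self.completed.clear()
        threading.Thread(target=self._work, args=(examples,)).start()

    def _work(self, examples):
        time.sleep(0.05)          # stand-in for a long training job
        self.training = False
        self.completed.set()      # 'Completed' can now fire for one tick

trainer = AsyncTrainer()
trainer.train([([0.0], 1.0)])
print(trainer.training)           # → True (training runs in the background)
trainer.completed.wait()
print(trainer.training)           # → False
```

The main loop stays free to render while the worker runs; the node's Completed execution pin corresponds to reacting to the event exactly once after it is set.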
Model Running Node
To use the trained model you add the "Machine Learning Robot" node to a graph. This is the end product of all the above work: a straightforward functional unit you can drop into graphs as a single node for recognising patterns in parameter data and driving other systems with the results.
The node takes a model asset and some current live parameters and provides the output closest to the one it recognises from the examples it was trained on.
images/ui/Node_MachineLearningRobot.png
See Running Models for a more in-depth look at using this node.
Targets
The images/ui/Pin_Object.png Model input should be provided with a model object to run, and the images/ui/Pin_Struct.png Live Parameters input fed from a parameter collection node (directly or indirectly). Normally you can just select an existing asset from the project via the drop-down menu on the pin, but it could also be supplied by selection logic for more advanced use cases. The model runs on the live input values and produces output values according to the patterns it learned during the training stage.
See Asset Types for more on the model asset type.
Control Inputs
The model is only run when requested according to the Run input pin:
- images/ui/Pin_Bool.png Run - Run the model against the current input parameters to produce an appropriate output
This input is a Boolean value meaning it can be true or false.
- Single Sampling - For models that operate on a single sample, the run function is triggered on any frame tick where the input is true. Typically you can hook up a toggle button input to turn the run on or off; alternatively, for standalone operation it can simply be left set to true.
- Series Sampling - For models that operate on a series of samples, parameters are accumulated internally for as long as the Run input is true; only when it goes false ('falling edge trigger') is the model run on the accumulated sequence of parameters. After a run the accumulated input is cleared, ready for another accumulation pass.
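The series-sampling behaviour can be sketched conceptually. This Python illustration (hypothetical names, a stand-in model) shows the accumulate-then-run-on-falling-edge pattern described above, not the InteractML implementation:

```python
# Conceptual sketch (hypothetical names): series-mode running accumulates
# parameter snapshots while Run is true, evaluates the model only on the
# falling edge (true -> false), then clears the buffer for the next series.

class SeriesRunner:
    def __init__(self, model):
        self.model = model
        self.buffer = []
        self.prev_run = False

    def tick(self, run, live_params):
        result = None
        if run:
            self.buffer.append(live_params)   # accumulate while active
        elif self.prev_run:                   # falling edge: evaluate now
            result = self.model(self.buffer)
            self.buffer = []                  # ready for the next series
        self.prev_run = run
        return result

length_model = lambda series: float(len(series))  # stand-in for a real model
runner = SeriesRunner(length_model)
outputs = [runner.tick(r, [0.0]) for r in [True, True, True, False]]
print(outputs)  # → [None, None, None, 3.0]
```

No output is produced while Run is held true; the single result appears on the tick where Run drops to false.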
Output
The Machine Learning Robot node has the following output:
- Output images/ui/Pin_Float.png - Result of running the model on the input parameters: one of the expected outputs it was trained with, or a mix of them (depending on whether the model is discrete or continuous).
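To make the discrete case concrete, here is a conceptual Python sketch (hypothetical names and data, a deliberately simplified nearest-example scheme rather than InteractML's actual algorithms) of how a discrete model returns exactly one of the trained expected outputs:

```python
# Conceptual sketch: a discrete model returns one of the exact expected
# outputs it was trained with (here, whichever example is nearest), while
# a continuous model could instead blend between them.
import math

# (parameters, expected output) pairs, as recorded into a training set
examples = [([0.0, 0.0], 1.0), ([1.0, 1.0], 2.0)]

def run_discrete(live):
    """Return the expected output of the single nearest example."""
    return min(examples, key=lambda ex: math.dist(ex[0], live))[1]

print(run_discrete([0.1, 0.2]))  # → 1.0 (closest to the first example)
print(run_discrete([0.9, 0.8]))  # → 2.0 (closest to the second example)
```

A continuous (regression-style) model would instead interpolate, producing values between 1.0 and 2.0 for inputs that fall between the examples.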
Configuration
The running node has configuration options that are not designed to be changed dynamically at runtime; they are static options that significantly affect the behaviour and available input/output pins of the node. Because they affect the node graph, they can only be applied at edit time.
- Label Type - The type of output expected, this is either Simple (float) output or Composite (structured) output and should match the type of Expected Output used when recording the training set the model was trained on.
- Background Operation - The running node can operate in the background (asynchronously), see below for details.
Background Operation
The running node can operate in the background (asynchronously) to avoid stalling the main engine thread and rendering. This is needed when running the model takes a long time or when an application can't afford any rendering interruptions (e.g. VR). Enabling this changes the operating mode and provides some additional outputs:
images/ui/Node_MachineLearningRobot_Async.png
- Running images/ui/Pin_Bool.png - Is the model currently running? (true whilst a run is ongoing in the background)
- Completed images/ui/Pin_Exec.png - Execution output that is fired for one tick after the run has completed on the current input parameter set.