Technical - Interactml/iml-unreal GitHub Wiki
Overview
A guide to help you understand the design, the structure, and the operation of the InteractML plugin within Unreal and continue its maintenance and modification in the future.
The InteractML plugin is implemented in C++ as a plugin for Unreal Engine. This is split into three modules, handling the run-time operation, the blueprint nodes, and the editor integration.
Development
Plugin & Projects
Unreal plug-in modules live in a Plugins folder within your Unreal Project, e.g. alongside its Content and Config folders. For development the plugin will always need to be part of a project. When shipped, the plugin gets promoted to an Engine Plugin. Typically a plugin is optional on a per-project basis and is added/removed using the Epic Launcher or in the project plugin settings within the Unreal Editor itself. Also, the demo project has been reverted to Blueprint only for simplicity and doesn't contain any C++ code itself. This situation has some implications:
- To use the demo project with the released plugin you will need to temporarily delete the development plugin (the `Plugins\InteractML` folder).
- To use the demo project to develop/maintain the plugin you will need to temporarily convert it back to a C++ enabled project (Add Class in editor).
- You can't develop the plugin against a version of Unreal Engine that has the released plugin installed, so you will need to temporarily remove it or work against a (later) version that doesn't have it installed.
💡 In the future we may split the demo part of the project away from the plugin development part, as a separate repository.
Building
You need to build the plugin in the context of a C++ enabled project. To do this:
- Right click on the project file (e.g. `InteractMLUE.uproject`) and select Generate Visual Studio project files
- Right click on the generated `InteractMLUE.sln` file and select Open with → Microsoft Visual Studio 2019
- Select the build configuration you need (see table below)
- Select Build → Rebuild Solution*

This will build any project code (`/Source`), with EXE and DLL output going to the `/Binaries` folder, as well as the plugin itself with its module DLL files output to the `/Plugins/InteractML/Binaries` folder.
Launching Unreal
The project can be launched/opened in the Unreal Editor in several ways:
- Select Debug → Start Debugging in Visual Studio.
- Press F5 in Visual Studio.
- Double click on the `InteractMLUE.uproject` file. NOTE: This only works as expected if you have built the `Development Editor|Win64` configuration and target.
Build Configurations
| Name | Modules | Optimisation | Debug Info. | Purpose |
|---|---|---|---|---|
| DebugGame | Runtime | Unoptimised (slow) | Yes | Application |
| DebugGame Editor | All | Unoptimised (slow) | Yes | Editor |
| Development | Runtime | Optimised | Yes | Application |
| Development Editor * | All | Optimised | Yes | Editor |
| Shipping | Runtime | Optimised | No | Application |

\* This is the configuration you usually want
Build Targets
The following build targets are available from an Unreal project solution (by default, building on Windows):
| Name | OS | Architecture | Notes |
|---|---|---|---|
| Win32 | Windows | 32 bits | Only for standalone applications that need to target older Windows systems. |
| Win64 * | Windows | 64 bits | Normal target. Editor only works on 64 bit. |

\* This is the target you usually want
⚠️ Win32 is no longer supported as of UE 5.0 so it's been dropped from InteractML altogether for now. We could add support for it back in for the 4.x versions, but it requires slightly fiddly `InteractML.Build.cs` and `package_plugin.cmd` changes (left as an exercise for the reader).
Building RapidLib
The RapidLib Machine Learning library is built separately from the project and plugin. This is because the resulting .lib files need to be shipped with the plugin source code in order to be able to build that independently of RapidLib source. This is largely for convenience, but it means: a) you don’t have to ship the RapidLib source with the plugin, and b) it may be tricky to get RapidLib building within the InteractML project (build settings/warning levels/defines).
Normally you don't need to build RapidLib as the libraries are pre-built and checked in with the plugin source code (in `\Plugins\InteractML\Source\3rdParty\RapidLib\lib`).
). To build RapidLib there is a separate Visual Studio solution file within the plugin.
- Navigate to and open `\Plugins\InteractML\Source\3rdParty\RapidLib\vs2019\RapidLib.sln` in Visual Studio 2019
- Select Build → Batch Build
- Tick all four configurations (or Select All)
- Click Rebuild
- Check all completed successfully and the four .lib files are updated
- Build the main project to check it’s working fully
- Submit the changed .lib files to source control
💡 The Debug builds are linked against the Release C Runtime because Unreal doesn't use the Debug CRT in Debug builds (for performance). They are still unoptimised to help debugging. See InteractMLUE.Build.cs for details of how the RapidLib version is selected.
💡 The original RapidLib source has its own build setup (cmake based), but we have a custom build because some of the build settings need to be set up differently to operate correctly with Unreal.
Building Standalone Application
It is possible to build any of the demo maps into a standalone application as follows:
- Open the project in Unreal Editor.
- Navigate to Edit → Project Settings
- Project → Description
- Optionally fill in the project description fields
- Select if you want it to “Start in VR” (i.e. for the Semaphore demo)
- Project → Maps & Modes
- Set Game Default Map to the demo level you want e.g. “RegressionDemo”
- Projects → Packaging
- If you have your Examples/Model data files outside the Content folder (the usual setup) you need to add their relative path to the "Additional Non-Asset Directories To Copy" list (in the advanced Packaging options), e.g. ../Data for the default location alongside the Content folder. NOTE: Don't use the button to select the path; you need to type relative paths into the text box.
- Projects → Supported Platforms
- Set to the platforms you support, e.g. Windows (64-bit)
- Projects → Target Hardware
- Normally this is just Desktop/Console and Maximum Quality, but you may need other settings if, for example, you are targeting mobile/Quest.
- Select File → Package Project → Windows (64-bit)
- Select a new/previous folder to put the built version in
- Wait for the packaging process to complete (some minutes)
- If there are any build problems look through the Output Log window
- You should now have a standalone build that will run on its own
💡 The build process normally builds all the project and plugin code from scratch using the Shipping configuration, so the code needs to compile in Shipping too.
Packaging
Versions
At release we support multiple Unreal Engine versions (4.25, 4.26, and 4.27), but for code plugins you need to submit separate builds to the Marketplace product page, one for each version. For our build there are currently no code changes needed between versions, so the difference between the builds is largely cosmetic; only the version in the `.uplugin` file needs to change. The `Tools/build.cmd` file defines the engine versions we support and package for; new versions can be added here. The build scripts inject the engine version number into the `.uplugin` file for each version built as `EngineVersion`.

We also maintain a plugin version number purely for tracking and separating the uploaded files. This is set in the `Tools\build.cmd` file and injected into the `.uplugin` file as `Version` and `VersionName`.
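For illustration, these keys are standard Unreal plugin descriptor fields; a hypothetical abbreviated `.uplugin` fragment might look like the following (all values are examples, not the shipped plugin's, and the module types are a guess based on the module descriptions later in this document):

```json
{
  "FileVersion": 3,
  "Version": 7,
  "VersionName": "1.1",
  "FriendlyName": "InteractML",
  "EngineVersion": "4.27.0",
  "Modules": [
    { "Name": "InteractML", "Type": "Runtime", "LoadingPhase": "Default" },
    { "Name": "InteractMLScripting", "Type": "UncookedOnly", "LoadingPhase": "Default" },
    { "Name": "InteractMLEditor", "Type": "Editor", "LoadingPhase": "Default" }
  ]
}
```

The build scripts rewrite `Version`, `VersionName`, and `EngineVersion` per packaged engine version; the rest of the descriptor is unchanged between builds.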
There are now scripts for automatically packaging and preparing multiple plugin versions.
- Ensure you have an up-to-date branch (usually built from master)
- Update the version number in `Tools/build.cmd` if needed (new features/fixes)
- Run `Tools/build.cmd` (you can double-click on it)
- Wait for it to run through the build checks and packaging process (a few mins)
- On success
- The resulting .ZIP files should appear in the Builds folder
- These are the files you will upload and refer to on the Marketplace (see Upload/Link section below)
- On failure
- An error message should be visible somewhere in the log
- The script window should remain open until you press a key
⚠️ It is beyond the scope of this documentation to cover the Marketplace upload process and the specifics of using the InteractML account.
If you have access, you can refer to the 🔗InteractML Code Manual on Google Drive for a complete rundown of plugin maintenance and the release process.
Design
The process flow for the InteractML system is described by this flow diagram.
![Machine Learning Flow diagram](images/diagrams/Diagram_MachineLearningFlow.png)
- Live Parameters from the world and user are either:
- Recorded directly
- Used directly with a running model
- Accumulated in batches before being used with a running model
- Recorded examples are stored in a Training Set for later use
- Models are trained with a training set
- Trained models are stored in a model asset for later use
- A model is run to generate results
Storage
Model and Training Set objects are backed by JSON data files. These are managed and stored in the same way, so it made sense to have a common base for these asset types: the `InteractMLStorage` class. It handles the following:
- External storage file path
- Synchronisation upon datafile or asset changes
- Loading & Saving of JSON data
- Change tracking
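The responsibilities above can be sketched in plain C++ (an illustrative stand-in only; the real `InteractMLStorage` is a UObject with Unreal's JSON and asset machinery, and these names are hypothetical):

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <utility>

// Illustrative storage base: tracks an external file path, a dirty flag
// for change tracking, and load/save of the raw JSON payload.
class StorageBase
{
public:
    explicit StorageBase(std::string file_path) : path(std::move(file_path)) {}

    // Change tracking: mark the in-memory state as modified.
    void MarkDirty() { dirty = true; }
    bool IsDirty() const { return dirty; }

    // Load the backing JSON file into memory (returns false if missing).
    bool Load()
    {
        std::ifstream in(path);
        if (!in) return false;
        std::ostringstream buffer;
        buffer << in.rdbuf();
        json_text = buffer.str();
        dirty = false;
        return true;
    }

    // Save the in-memory JSON back out and clear the dirty flag.
    bool Save()
    {
        std::ofstream out(path);
        if (!out) return false;
        out << json_text;
        dirty = false;
        return true;
    }

    const std::string& Json() const { return json_text; }
    void SetJson(std::string text) { json_text = std::move(text); MarkDirty(); }

private:
    std::string path;      // external storage file path
    std::string json_text; // serialised JSON payload
    bool dirty = false;    // has unsaved changes?
};
```

Models and Training Sets would then layer their own serialisation on top of this shared load/save/dirty behaviour.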
Model
Based on the `InteractMLStorage` class, this serves as a common base for each Model type. It manages operation of, and serves as an abstract interface for: Create, Load, Save, Type, Mode, Training, Running, and asynchronous operation. Derived model types override various methods to configure and specialise.

Derived types: `InteractMLClassificationModel`, `InteractMLRegressionModel`, and `InteractMLDynamicTimeWarpModel`.
Training Set
Based on the `InteractMLStorage` class, this serves as a container for example data and expected outputs.
Context
Serves as an operating context for InteractML Blueprint nodes used in graphs within actors. These are injected into the containing actor to provide session state for running the ML nodes in the graph. They are needed to maintain coordination and pass information between nodes in what is effectively a stateless function graph.
![Storage and Context diagram](images/diagrams/Diagram_StorageAndContext.png)
Model State
One of the main objects that the Context manages. This provides storage to keep state shared between nodes, and between update frames, of a model operating in Blueprint graphs. Because one Model can be reused multiple times independently and in multiple graphs, a place to keep each instance's operating state is needed.
Labels
Based on the built-in Unreal type `UUserDefinedStruct`, these are a list of named, typed variables (and their defaults). This also handles the conversion of an actual structure instance into a stream of float values for capture, and the re-creation back from floats to structured data again.
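The float-conversion idea can be sketched in plain C++ (a hypothetical label struct and helper names, not the actual `UUserDefinedStruct`-based implementation, which works reflectively over any structure definition):

```cpp
#include <vector>

// Hypothetical label structure: a few named, typed members.
struct PoseLabel
{
    float speed = 0.0f;
    bool  is_jumping = false;
    int   pose_id = 0;
};

// Flatten a label instance into a stream of floats for capture.
std::vector<float> ToFloats(const PoseLabel& label)
{
    return { label.speed,
             label.is_jumping ? 1.0f : 0.0f,
             static_cast<float>(label.pose_id) };
}

// Re-create the structured data from the float stream again.
PoseLabel FromFloats(const std::vector<float>& values)
{
    PoseLabel label;
    label.speed      = values.at(0);
    label.is_jumping = values.at(1) != 0.0f;
    label.pose_id    = static_cast<int>(values.at(2));
    return label;
}
```

A round trip through `ToFloats`/`FromFloats` yields an identical label, which is what lets the system hand structured Expected Outputs to models that only understand numbers.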
![Label Propagation diagram](images/diagrams/Diagram_LabelPropagation.png)
Label Table
A very basic wrapper around the built-in Unreal type `UDataTable`. Exactly the same function, but 'rebranded' to fit into the InteractML ecosystem.
Label Cache
During example recording all unique Labels are captured so that they can be properly associated with the example parameters they were recorded with. This is needed because some models only support a single input value, and we want to be able to recreate the source label value (Expected Output) when the model runs as the same label type (Output).
A label cache contains several parts:
- Label Type - The label asset type (structure definition) all entries are based on
- Captured Values - The specific values of each label instance, as a stream of floats
- String Map - Strings can't be converted to floats so they are stored and just indexed instead
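The String Map part can be sketched like this (an illustrative stand-in, not the actual label cache code; names are hypothetical):

```cpp
#include <string>
#include <vector>

// Strings can't be meaningfully converted to floats, so the cache stores
// each unique string once and uses its index as the float value instead.
class StringMap
{
public:
    // Return the float index for a string, adding it if not seen before.
    float IndexOf(const std::string& text)
    {
        for (size_t i = 0; i < strings.size(); ++i)
            if (strings[i] == text)
                return static_cast<float>(i);
        strings.push_back(text);
        return static_cast<float>(strings.size() - 1);
    }

    // Recover the original string from a captured float index.
    const std::string& Lookup(float index) const
    {
        return strings.at(static_cast<size_t>(index));
    }

private:
    std::vector<std::string> strings; // one entry per unique string seen
};
```

The captured float stream then only ever contains indices, and the cache turns them back into the original strings when a label is reconstructed.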
Task
To support asynchronous operation, all Training and Running operations are wrapped up as a 'task' object (InteractML specific, not to be confused with other conceptually similar uses of the word 'Task'). These contain a copy of all the information needed to perform the operation for a given model. They are generated by the Model objects in response to a BeginTraining or BeginRunning request. The same mechanism is used for synchronous operation, except that the task is run/completed immediately instead of being dispatched to the `FInteractMLModule::RunTask` method for background scheduling.
The task operation is split into three phases:
- Begin - The task creation process, all data setup/capture is performed here - this happens on the main thread
- Run - The functionality to perform in the background, e.g. train or run the model - this happens on a background thread
- Apply - Results of the run phase are presented back to the source of the task for handling - this happens on the main thread
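The three phases can be sketched as a plain C++ task object (illustrative structure only, not the actual `InteractMLTask` API; the real system schedules Apply back onto the main thread rather than blocking on a join as this simplified sketch does):

```cpp
#include <thread>

// Illustrative three-phase task: Begin on the main thread, Run on a
// background thread, Apply back on the main thread afterwards.
struct TrainTask
{
    int  example_count = 0; // captured in Begin (data setup, main thread)
    bool trained = false;   // produced in Run (background work)
    bool applied = false;   // consumed in Apply (main-thread handling)

    void Begin(int examples) { example_count = examples; }
    void Run()               { trained = example_count > 0; }
    void Apply()             { applied = trained; }
};

// Dispatch helper: Run happens on a worker thread; Apply after it joins.
void RunTaskAsync(TrainTask& task)
{
    std::thread worker([&task] { task.Run(); });
    worker.join();
    task.Apply();
}
```

Synchronous operation corresponds to calling `Run()` and `Apply()` directly in sequence instead of going through the dispatch helper.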
Code Files
The following are brief descriptions of the build and header files in the project; they all have equivalent `.cpp` files for their implementations.
Runtime
The main operational part of the plugin is the Runtime Module. This is the heart of InteractML and handles integration into the engine, interoperating with the RapidLib library and providing all of the main ML plugin functions and asset types.
- `InteractML\InteractML.Build.cs` - how Unreal should go about building this module
- `InteractML\Public\InteractML.h` - the main module entry point, init/setup
- `InteractML\Public\InteractMLBlueprintLibrary.h` - interfacing functions for blueprint use
- `InteractML\Public\InteractMLContext.h` - manages per-node state e.g. ModelState
- `InteractML\Public\InteractMLHelpers.h` - utility (action trigger handling)
- `InteractML\Public\InteractMLLabel.h` - composite label type (similar to enums)
- `InteractML\Public\InteractMLLabelCache.h` - mapping of specific label instance to an index
- `InteractML\Public\InteractMLLabelTable.h` - collections of label instances (like Data Tables)
- `InteractML\Public\InteractMLModel.h` - common code for asset of each ML model type
- `InteractML\Public\InteractMLModelState.h` - per-node state for parameter collection, recording to training sets, and running models
- `InteractML\Public\InteractMLParameters.h` - collection of values to train/recognise against
- `InteractML\Public\InteractMLStorage.h` - base for models and training set, handles file IO
- `InteractML\Public\InteractMLTask.h` - common handler for processing tasks, enables async
- `InteractML\Public\InteractMLTrainingSet.h` - asset type to store recorded examples
- `InteractML\Public\Models\InteractMLClassificationModel.h` - model specialisation
- `InteractML\Public\Models\InteractMLDynamicTimeWarpModel.h` - model specialisation
- `InteractML\Public\Models\InteractMLRegressionModel.h` - model specialisation
Scripting
The scripting module lives in-between the Runtime and Editor modules. It is not needed at run-time, but is needed during the build process (compilation of blueprints). This happens both in-editor and during a build of the standalone game (which can be done outside the editor).
Most of the functionality here consists of Node classes that are responsible for assembling the custom Blueprint nodes into 'wired up' functional blocks, and managing how the nodes appear in the editor.
- `InteractMLScripting\InteractMLScripting.Build.cs` - how Unreal should build this module
- `InteractMLScripting\Public\InteractMLScripting.h` - the main module entry point, init/setup
- `InteractMLScripting\Private\InteractMLConstants.h` - some shared constant config
- `InteractMLScripting\Public\InteractMLNode.h` - some common functionality shared between the different InteractML node types (e.g. appearance)
- `InteractMLScripting\Public\InteractMLExternalModelNode.h` - a custom node type
- `InteractMLScripting\Public\InteractMLExternalTrainingSetNode.h` - a custom node type
- `InteractMLScripting\Public\InteractMLParameterNode.h` - a custom node type
- `InteractMLScripting\Public\InteractMLRecordingNode.h` - a custom node type
- `InteractMLScripting\Public\InteractMLRunningNode.h` - a custom node type
- `InteractMLScripting\Public\InteractMLTrainingNode.h` - a custom node type
Editor
Editing functionality and display is handled in a separate module and not used during standalone plugin operation. This provides object factories, menu hooks, UI customisations, custom editing UI, and central editing support.
- `InteractMLEditor\InteractMLEditor.Build.cs` - how Unreal should build this module
- `InteractMLEditor\Public\InteractMLEditor.h` - the main module entry point, init/setup
- `InteractMLEditor\Private\InteractMLLabelActions.h` - asset appearance and behaviour
- `InteractMLEditor\Private\InteractMLLabelTableActions.h` - asset appearance and behaviour
- `InteractMLEditor\Private\InteractMLModelActions.h` - asset appearance and behaviour
- `InteractMLEditor\Private\InteractMLTrainingSetActions.h` - asset appearance and behaviour
- `InteractMLEditor\Private\InteractMLTrainingSetEditor.h` - this has a custom asset editor panel
- `InteractMLEditor\Public\InteractMLLabelFactory.h` - creation of this asset type
- `InteractMLEditor\Public\InteractMLLabelTableFactory.h` - creation of this asset type
- `InteractMLEditor\Public\InteractMLModelFactory.h` - creation of this asset type
- `InteractMLEditor\Public\InteractMLTrainingSetFactory.h` - creation of this asset type
Blueprint Nodes
Unreal allows for the creation of custom Blueprint node types. These are used to implement some of the more advanced InteractML nodes.
Custom Blueprint Nodes
A brief explanation of how custom Blueprint nodes work.
- This is how blueprints operate internally; they are just a load of function calls.
- The custom code for the node controls how they are composed, what inputs and outputs are connected, and the order they are executed.
- Custom blueprint nodes can have multiple functions chained together internally to implement their behaviour.
- This structure is fixed at blueprint compile time and isn't dynamic at runtime (the custom node doesn't even exist at runtime; all of a graph's nodes get collapsed down into one big series of function calls and parameter passing).
- It can however be dynamic at edit time, for example depending on the number of inputs you have, and even if they are connected.
![Unreal Custom Blueprints diagram](images/diagrams/Diagram_UnrealCustomBlueprints.png)
Below is a breakdown of how some of the custom nodes built for InteractML are put together and operate.
Parameter Node
The Parameter Node collects together values of various differing types and adds them to a Parameter Collection structure for the recording and running processes. The node has a dynamic input list that can be added to and removed from. The list of pins drives the internal node generation process. After a function call to obtain a Parameter Collection object, each pin causes a utility function call to add its own type of value to the collection. These calls are chained together until the collection is complete. The Parameter Collection is presented at the output for use elsewhere in the graph.
The `InteractMLBlueprintLibrary` class hosts a number of interoperation functions that the nodes use internally (as well as some other utility functions available to the user). The parameter add functions are as follows:
```cpp
void AddIntegerParameter( FInteractMLParameters Parameters, int Value );
void AddFloatParameter( FInteractMLParameters Parameters, float Value );
void AddBooleanParameter( FInteractMLParameters Parameters, bool Value );
void AddVector2Parameter( FInteractMLParameters Parameters, FVector2D Value );
void AddVector3Parameter( FInteractMLParameters Parameters, FVector Value );
void AddQuaternionParameter( FInteractMLParameters Parameters, FQuat Value );
void AddColourParameter( FInteractMLParameters Parameters, FLinearColor Value );
```
![Parameter Collection Node diagram](images/diagrams/Diagram_ParameterCollectionNode.png)
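To make the chaining concrete, here is a plain C++ mock of a parameter collection with add functions of that shape (illustrative only; the real functions operate on Unreal's FInteractMLParameters type and maths types, which are replaced with simple stand-ins here):

```cpp
#include <vector>

// Mock parameter collection: every typed add flattens its value into
// the shared float list, mimicking the chained utility calls the
// Parameter Node generates (one call per input pin).
struct ParameterCollection
{
    std::vector<float> values;
};

void AddFloatParameter(ParameterCollection& parameters, float value)
{
    parameters.values.push_back(value);
}

void AddBooleanParameter(ParameterCollection& parameters, bool value)
{
    parameters.values.push_back(value ? 1.0f : 0.0f);
}

void AddVector3Parameter(ParameterCollection& parameters,
                         float x, float y, float z)
{
    // A vector pin contributes one float per component.
    parameters.values.push_back(x);
    parameters.values.push_back(y);
    parameters.values.push_back(z);
}
```

A float, boolean, and vector pin together would thus yield five floats in the collection, which is the flat parameter stream the models train and run against.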
Recording Node
The recording node is largely a wrapper for the main Record function and most of the IO maps directly to the function parameter list.
![Record Training Set Node diagram](images/diagrams/Diagram_RecordTrainingSetNode.png)
Points of note:
- The Actor input is configurable; it is normally hooked up automatically to the node's containing Actor, but can be explicitly provided.
- The internal graph node unique ID is used to provide operating context for the function call (see Context object above).
- There are multiple record functions used, selected by the node's Mode (single/series) configuration property and the type of the Expected Output input (simple/composite type).
Training Node
The Train Model node operates in two different ways depending on whether it is configured to operate synchronously or asynchronously. Synchronous operation is basically a wrapper for the Train function (like the previous node), but asynchronous operation calls a different function and has some additional logic after it to provide an execution path that fires on completion of the training operation.
![Train Model Node diagram](images/diagrams/Diagram_TrainModelNode.png)
Running Node
Like the Training Node, the Run node operates in two modes; synchronous and asynchronous with additional logic for async operation.
![Run Model Node diagram](images/diagrams/Diagram_RunModelNode.png)
External Training Set
A wrapper for the External training set function providing node ID context.
![External Training Set Node diagram](images/diagrams/Diagram_ExternalTrainingSetNode.png)
External Model
A wrapper for the External Model function providing node ID context.