Technical - Interactml/iml-unreal GitHub Wiki

Overview

A guide to help you understand the design, the structure, and the operation of the InteractML plugin within Unreal and continue its maintenance and modification in the future.

The InteractML plugin is implemented in C++ as a plugin for Unreal Engine. This is split into three modules, handling the run-time operation, the blueprint nodes, and the editor integration.

Development

Plugin & Projects

Unreal plug-in modules live in a Plugins folder within your Unreal Project, e.g. alongside its Content and Config folders. For development the plugin always needs to be part of a project. When shipped, the plugin is promoted to an Engine Plugin. Typically a plugin is optional on a per-project basis and is added/removed using the Epic Launcher or in the project plugin settings within the Unreal Editor itself. Also, the demo project has been reverted to Blueprint-only for simplicity and doesn't contain any C++ code itself. This situation has some implications:

  1. To use the demo project with the released plugin you will need to temporarily delete the development plugin (Plugins\InteractML folder).
  2. To use the demo project to develop/maintain the plugin you will need to temporarily convert it back to a C++ enabled project (Add Class in editor).
  3. You can't develop the plugin against a version of Unreal Engine that has the released plugin installed, so you will need to temporarily remove it or work against a (later) engine version that doesn't have it installed.

💡 In the future we may split the demo part of the project away from the plugin development part, as a separate repository.

Building

You need to build the plugin in the context of a C++ enabled project. To do this:

  1. Right click on the project file (e.g. InteractMLUE.uproject) and select Generate Visual Studio project files
  2. Right click on the generated InteractMLUE.sln file and select Open with --> Microsoft Visual Studio 2019
  3. Select the build configuration you need (see table below)
  4. Select Build --> Rebuild Solution

This will build any project code (/Source) and results in EXE and DLL output to the /Binaries folder, as well as the plugin itself with its module DLL files output to the /Plugins/InteractML/Binaries folder.

Launching Unreal

The project can be launched/opened in the Unreal Editor in several ways:

  1. Select Debug → Start Debugging in Visual Studio.
  2. Press F5 in Visual Studio.
  3. Double click on the InteractMLUE.uproject file. NOTE: This only works as expected if you have built the Development Editor|Win64 configuration and target.

Build Configurations

Name | Modules | Optimisation | Debug Info. | Purpose
--- | --- | --- | --- | ---
DebugGame | Runtime | Unoptimised (slow) | Yes | Application
DebugGame Editor | All | Unoptimised (slow) | Yes | Editor
Development | Runtime | Optimised | Yes | Application
Development Editor * | All | Optimised | Yes | Editor
Shipping | Runtime | Optimised | No | Application

* This is the configuration you usually want

Build Targets

The following build targets are available from an Unreal project solution (by default, building on Windows):

Name | OS | Architecture | Notes
--- | --- | --- | ---
Win32 | Windows | 32 bits | Only for standalone applications that need to target older Windows systems.
Win64 * | Windows | 64 bits | Normal target. Editor only works on 64 bit.

* This is the target you usually want

⚠️ Win32 is no longer supported as of UE 5.0, so it's been dropped from InteractML altogether for now. We could add support for it back in for the 4.x versions, but it requires slightly fiddly InteractML.Build.cs and package_plugin.cmd changes (left as an exercise for the reader).

Building RapidLib

The RapidLib Machine Learning library is built separately from the project and plugin. This is because the resulting .lib files need to be shipped with the plugin source code in order to be able to build the plugin independently of the RapidLib source. This is largely for convenience: a) you don't have to ship the RapidLib source with the plugin, and b) it could be tricky to get RapidLib building within the InteractML project (build settings/warning levels/defines).

Normally you don’t need to build RapidLib as the libraries are pre-built and checked in with the plugin source code (in \Plugins\InteractML\Source\3rdParty\RapidLib\lib). To build RapidLib there is a separate Visual Studio solution file within the plugin.

  1. Navigate to \Plugins\InteractML\Source\3rdParty\RapidLib\vs2019\RapidLib.sln and open the solution in Visual Studio 2019
  2. Select Build → Batch Build
  3. Tick all four configurations (or Select All)
  4. Click Rebuild
  5. Check all completed successfully and the four .lib files are updated
  6. Build the main project to check it’s working fully
  7. Submit the changed .lib files to source control

💡 The Debug builds are linked against the Release C Runtime because Unreal doesn't use the Debug CRT in Debug builds (for performance). They are still unoptimised to help debugging. See InteractMLUE.Build.cs for details of how the RapidLib version is selected.

💡 The original RapidLib source has its own build setup (CMake based), but we have a custom build because some of the build settings need to be set up differently to operate correctly with Unreal.

Building Standalone Application

It is possible to build any of the demo maps into a standalone application as follows:

  1. Open the project in Unreal Editor.
  2. Navigate to Edit → Project Settings
  3. Project → Description
    1. Optionally fill in the project description fields
    2. Select if you want it to “Start in VR” (i.e. for the Semaphore demo)
  4. Project → Maps & Modes
    1. Set Game Default Map to the demo level you want e.g. “RegressionDemo”
  5. Projects → Packaging
    1. If you have your Examples/Model data files outside the Content folder (as is usual) you need to add the relative path to the "Additional Non-Asset Directories To Copy" list (under the advanced Packaging options), e.g. ../Data for the default location alongside the Content folder. NOTE: Don't use the button to select the path; you need to type relative paths into the text box.
  6. Projects → Supported Platforms
    1. Set to the platforms you support, e.g. Windows (64-bit)
  7. Projects → Target Hardware
    1. Normally this is just Desktop/Console and Maximum Quality, but you may need other settings if, for example, you are targeting mobile/Quest.
  8. Select File → Package Project → Windows (64-bit)
  9. Select a new/previous folder to put the built version in
  10. Wait for the packaging process to complete (some minutes)
  11. If there are any build problems look through the Output Log window
  12. You should now have a standalone build that will run on its own

💡 The build process normally builds all the project and plugin code from scratch using the Shipping configuration, so the code needs to be known to compile in that configuration.

Packaging

Versions

At release we support multiple Unreal Engine versions (4.25, 4.26, and 4.27), but for code plugins you need to submit separate builds to the Marketplace product page, one for each engine version. Currently no code changes are needed between versions, so the difference between the builds is largely cosmetic; only the version in the .uplugin file needs to change. The Tools/build.cmd file defines the engine versions we support and package for; new versions can be added here. The build scripts inject the engine version number into the .uplugin file for each version built as EngineVersion.

We also maintain a plugin version number purely for tracking and separating the uploaded files. This is set in the Tools\build.cmd file and injected into the .uplugin file as Version and VersionName.
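As a concrete illustration, the injected fields end up in the plugin descriptor looking something like this (a hedged sketch: the version numbers shown are placeholders, and the real InteractML.uplugin contains many more fields; Version, VersionName, and EngineVersion are standard Unreal plugin descriptor fields):

```json
{
  "FileVersion": 3,
  "Version": 12,
  "VersionName": "1.2.0",
  "EngineVersion": "4.27.0"
}
```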

There are now scripts for automatically packaging and preparing multiple plugin versions.

  1. Ensure you have an up-to-date branch (usually built from master)
  2. Update the version number in Tools/build.cmd if needed (new features/fixes)
  3. Run Tools/build.cmd (you can double-click on it)
  4. Wait for it to run through the build checks and packaging process (a few mins)
  5. On success
    1. The resulting .ZIP files should appear in the Builds folder
    2. These are the files you will upload and refer to on the Marketplace (see Upload/Link section below)
  6. On failure
    1. An error message should be visible somewhere in the log
    2. The script window should remain open until you press a key

⚠️ It is beyond the scope of this documentation to cover the Marketplace upload process and the specifics of using the InteractML account.

If you have access, you can refer to the 🔗InteractML Code Manual on Google Drive for a complete rundown of plugin maintenance and the release process.

Design

The process flow for the InteractML system is described by this flow diagram.

images/diagrams/Diagram_MachineLearningFlow.png

  • Live Parameters from the world and user are either:
    • Recorded directly
    • Used directly with a running model
    • Accumulated in batches before being used with a running model
  • Recorded examples are stored in a Training Set for later use
  • Models are trained with a training set
  • Trained models are stored in a model asset for later use
  • A model is run to generate results

Storage

Model and Training Set objects are backed by JSON data files. These are managed and stored in the same way, so it made sense to have a common base for these asset types: the InteractMLStorage class. It handles the following:

  • External storage file path
  • Synchronisation upon datafile or asset changes
  • Loading & Saving of JSON data
  • Change tracking
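The responsibilities above can be sketched in plain C++. This is a simplified, non-Unreal model: the class name, members, and JSON handling are illustrative stand-ins for the real UCLASS, not its actual layout.

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <utility>

// Simplified, non-Unreal sketch of the InteractMLStorage responsibilities:
// an asset backed by an external JSON data file, with change tracking.
class StorageSketch
{
public:
    explicit StorageSketch(std::string path) : FilePath(std::move(path)) {}

    // External storage file path of the backing JSON data file
    const std::string& GetFilePath() const { return FilePath; }

    // Change tracking: marked dirty whenever in-memory state diverges from disk
    bool IsDirty() const { return bDirty; }
    void MarkDirty() { bDirty = true; }

    // Saving of JSON data (derived asset types would supply the payload)
    bool SaveJson(const std::string& json)
    {
        std::ofstream out(FilePath);
        if (!out) { return false; }
        out << json;
        bDirty = false;
        return true;
    }

    // Loading of JSON data back from the external file
    std::string LoadJson() const
    {
        std::ifstream in(FilePath);
        std::stringstream buffer;
        buffer << in.rdbuf();
        return buffer.str();
    }

private:
    std::string FilePath;   // external storage file path
    bool bDirty = false;    // change tracking flag
};
```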

Model

Based on the InteractMLStorage class, this serves as a common base for each Model type. It manages operation of, and serves as an abstract interface for, Create, Load, Save, Type, Mode, Training, Running, and asynchronous operation. Derived model types override various methods to configure and specialise behaviour.

Derived types: InteractMLClassificationModel, InteractMLRegressionModel, and InteractMLDynamicTimeWarpModel.
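The shape of this hierarchy can be sketched in plain C++. All names and signatures here are simplified assumptions for illustration, not the real Unreal classes.

```cpp
#include <string>
#include <vector>

// Illustrative sketch of the model hierarchy: a common base declaring the
// abstract interface, with derived types overriding methods to specialise.
struct ModelSketch
{
    virtual ~ModelSketch() = default;

    virtual std::string TypeName() const = 0;        // Type
    virtual bool IsSeries() const { return false; }  // Mode: single sample vs series
    virtual bool Train(const std::vector<float>& /*examples*/) { bTrained = true; return true; }
    virtual float Run(const std::vector<float>& inputs) = 0; // produce a result

    bool bTrained = false; // set once training has completed
};

// Derived specialisations (mirroring Classification, Regression, and DTW)
struct ClassificationSketch : ModelSketch
{
    std::string TypeName() const override { return "Classification"; }
    float Run(const std::vector<float>&) override { return 0.0f; } // a class label index
};

struct DynamicTimeWarpSketch : ModelSketch
{
    std::string TypeName() const override { return "DTW"; }
    bool IsSeries() const override { return true; }  // DTW matches whole recorded series
    float Run(const std::vector<float>&) override { return 0.0f; }
};
```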

Training Set

Based on the InteractMLStorage class, serves as a container for example data and expected outputs.

Context

Serves as an operating context for InteractML Blueprint nodes used in graphs within actors. These are injected into the containing actor to provide session state for running the ML nodes in the graph. They are needed to maintain coordination and pass information between nodes in what is effectively a stateless function graph.

images/diagrams/Diagram_StorageAndContext.png
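The per-node state lookup can be sketched as a small map keyed by node ID. This is a simplified stand-in: all names here are assumptions, and the real Context also manages Model State and other session objects.

```cpp
#include <map>
#include <string>

// Simplified sketch of the Context idea: since a Blueprint graph is
// effectively a stateless function graph, a context object injected into the
// containing actor holds per-node session state, keyed by each node's ID.
struct NodeStateSketch
{
    bool bRecording = false; // e.g. is this Recording node currently capturing?
    int SampleCount = 0;     // e.g. samples accumulated so far this session
};

class ContextSketch
{
public:
    // Find-or-create the state associated with a given graph node
    NodeStateSketch& GetState(const std::string& NodeID)
    {
        return States[NodeID]; // default-constructed on first access
    }

private:
    std::map<std::string, NodeStateSketch> States;
};
```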

Model State

One of the main objects that the Context manages. This provides storage to keep state shared between nodes and between update frames of a model operating in Blueprint graphs. Because one Model can be reused multiple times independently and in multiple graphs, a place to keep each instance's operating state is needed.

Labels

Based on the built-in Unreal type UUserDefinedStruct, these are a list of named, typed variables (and their defaults). This also handles the conversion of an actual structure instance into a stream of float values for capture, and the re-creation from floats back into structured data again.

images/diagrams/Diagram_LabelPropagation.png
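The float conversion can be illustrated with a toy example. The fixed struct below is hypothetical; the real system works generically over the members of a UUserDefinedStruct.

```cpp
#include <vector>

// Toy illustration of the Label conversion: a structure of named, typed
// variables is flattened into a stream of floats for capture, then
// re-created from floats later.
struct ExampleLabelSketch
{
    float Speed = 0.0f;
    bool bActive = false;
    int Count = 0;
};

// Structure instance -> stream of float values
std::vector<float> FlattenLabel(const ExampleLabelSketch& Label)
{
    return { Label.Speed, Label.bActive ? 1.0f : 0.0f, static_cast<float>(Label.Count) };
}

// Stream of float values -> structure instance again
ExampleLabelSketch RecreateLabel(const std::vector<float>& Floats)
{
    ExampleLabelSketch Label;
    Label.Speed = Floats[0];
    Label.bActive = Floats[1] != 0.0f;
    Label.Count = static_cast<int>(Floats[2]);
    return Label;
}
```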

Label Table

Very basic wrapper around built-in Unreal type UDataTable. Exactly the same function, but 'rebranded' to fit into the InteractML ecosystem.

Label Cache

During example recording all unique Labels are captured so that they can be properly associated with the example parameters they are recorded with. This is needed because some models only support a single input value, and we want to be able to recreate the source label value (Expected Output) as the same label type (Output) when the model runs.

A label cache contains several parts:

  • Label Type - The label asset type (structure definition) all entries are based on
  • Captured Values - The specific values of each label instance, as a stream of floats
  • String Map - Strings can't be converted to floats so they are stored and just indexed instead
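These parts can be sketched together in plain C++. The names below are assumptions for illustration, not the real API; note in particular how strings are side-stored and referenced by a float-encodable index.

```cpp
#include <string>
#include <vector>

// Illustrative sketch of the Label Cache: unique label values are captured as
// float streams, and strings (which can't be encoded as floats) are stored in
// a side table and referenced by index instead.
class LabelCacheSketch
{
public:
    // Store a string (deduplicated) and return the index it maps to, as a float
    float EncodeString(const std::string& Value)
    {
        for (size_t i = 0; i < StringMap.size(); ++i)
        {
            if (StringMap[i] == Value) { return static_cast<float>(i); }
        }
        StringMap.push_back(Value);
        return static_cast<float>(StringMap.size() - 1);
    }

    // Recover the original string from its float-encoded index
    const std::string& DecodeString(float Index) const
    {
        return StringMap[static_cast<size_t>(Index)];
    }

    // Capture one label instance's flattened float values
    void CaptureValues(std::vector<float> Values)
    {
        CapturedValues.push_back(std::move(Values));
    }
    size_t NumCaptured() const { return CapturedValues.size(); }

private:
    std::vector<std::vector<float>> CapturedValues; // Captured Values
    std::vector<std::string> StringMap;             // String Map
};
```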

Task

To support asynchronous operation all Training and Running operations are wrapped up as a 'task' object (InteractML specific, not to be confused with other conceptually similar uses of the word 'Task'). These contain a copy of all the information needed to perform the operation for a given model. They are generated by the Model objects in response to a BeginTraining or BeginRunning request. The same mechanism is used for synchronous operation, except that the task is run/completed immediately instead of being dispatched to the FInteractMLModule::RunTask method for background scheduling.

The task operation is split into three phases:

  1. Begin - The task creation process, all data setup/capture is performed here - this happens on the main thread
  2. Run - The functionality to perform in the background, e.g. train or run the model - this happens on a background thread
  3. Apply - Results of the run phase are presented back to the source of the task for handling - this happens on the main thread
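The three phases above can be modelled in miniature. This is a deliberately toy sketch: the real tasks are dispatched via FInteractMLModule::RunTask and carry full model data, whereas this stand-in just doubles a number and uses std::thread directly.

```cpp
#include <atomic>
#include <thread>

// Simplified model of the three-phase task lifecycle: Begin on the main
// thread, Run on a background thread, Apply back on the main thread.
struct TaskSketch
{
    // Phase 1: Begin - a copy of all information needed is captured here
    int Input = 0;
    int Result = 0;
    std::atomic<bool> bDone{ false };

    // Phase 2: Run - the background work (e.g. train or run a model)
    void Run()
    {
        Result = Input * 2; // stand-in for the real training/running work
        bDone = true;
    }

    // Phase 3: Apply - results handed back to the task's source for handling
    bool Apply(int& OutResult) const
    {
        if (!bDone) { return false; }
        OutResult = Result;
        return true;
    }
};

// Synchronous operation: the task is run/completed immediately inline
inline int RunTaskSynchronously(int Input)
{
    TaskSketch Task;    // Begin (main thread)
    Task.Input = Input;
    Task.Run();         // Run (inline instead of a background thread)
    int Out = 0;
    Task.Apply(Out);    // Apply (main thread)
    return Out;
}
```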

Code Files

The following are brief descriptions of build and header files in the project, they all have equivalent .cpp files for their implementations.

Runtime

The main operational part of the plugin is the Runtime Module. This is the heart of InteractML and handles integration into the engine, interoperating with the RapidLib library and providing all of the main ML plugin functions and asset types.

  • InteractML\InteractML.Build.cs - how Unreal should go about building this module
  • InteractML\Public\InteractML.h - the main module entry point, init/setup
  • InteractML\Public\InteractMLBlueprintLibrary.h - interfacing functions for blueprint use
  • InteractML\Public\InteractMLContext.h - manages per-node state e.g. ModelState.
  • InteractML\Public\InteractMLHelpers.h - utility (action trigger handling)
  • InteractML\Public\InteractMLLabel.h - composite label type (similar to enums)
  • InteractML\Public\InteractMLLabelCache.h - mapping of specific label instance to an index
  • InteractML\Public\InteractMLLabelTable.h - collections of label instances (like Data Tables)
  • InteractML\Public\InteractMLModel.h - common code for asset of each ML model type
  • InteractML\Public\InteractMLModelState.h - per-node state for parameter collection, recording to training sets, and running models.
  • InteractML\Public\InteractMLParameters.h - collection of values to train/recognise against.
  • InteractML\Public\InteractMLStorage.h - base for models and training set, handles file IO
  • InteractML\Public\InteractMLTask.h - common handler for processing tasks, enables async
  • InteractML\Public\InteractMLTrainingSet.h - asset type to store recorded examples
  • InteractML\Public\Models\InteractMLClassificationModel.h - model specialisation
  • InteractML\Public\Models\InteractMLDynamicTimeWarpModel.h - model specialisation
  • InteractML\Public\Models\InteractMLRegressionModel.h - model specialisation

Scripting

The scripting module lives in-between the Runtime and Editor modules. It is not needed at run-time, but is needed during the build process (compilation of blueprints). This happens both in-editor and during a build of the standalone game (which can be done outside the editor).

Most of the functionality here consists of Node classes that are responsible for assembling the custom Blueprint nodes into 'wired up' functional blocks, and for managing how the nodes appear in the editor.

  • InteractMLScripting\InteractMLScripting.Build.cs - how Unreal should build this module
  • InteractMLScripting\Public\InteractMLScripting.h - the main module entry point, init/setup
  • InteractMLScripting\Private\InteractMLConstants.h - some shared constant config
  • InteractMLScripting\Public\InteractMLNode.h - some common functionality shared between the different InteractML node types (e.g. appearance)
  • InteractMLScripting\Public\InteractMLExternalModelNode.h - a custom node type
  • InteractMLScripting\Public\InteractMLExternalTrainingSetNode.h - a custom node type
  • InteractMLScripting\Public\InteractMLParameterNode.h - a custom node type
  • InteractMLScripting\Public\InteractMLRecordingNode.h - a custom node type
  • InteractMLScripting\Public\InteractMLRunningNode.h - a custom node type
  • InteractMLScripting\Public\InteractMLTrainingNode.h - a custom node type

Editor

Editing functionality and display is handled in a separate module and not used during standalone plugin operation. This provides object factories, menu hooks, UI customisations, custom editing UI, and central editing support.

  • InteractMLEditor\InteractMLEditor.Build.cs - how Unreal should build this module
  • InteractMLEditor\Public\InteractMLEditor.h - the main module entry point, init/setup
  • InteractMLEditor\Private\InteractMLLabelActions.h - asset appearance and behaviour
  • InteractMLEditor\Private\InteractMLLabelTableActions.h - asset appearance and behaviour
  • InteractMLEditor\Private\InteractMLModelActions.h - asset appearance and behaviour
  • InteractMLEditor\Private\InteractMLTrainingSetActions.h - asset appearance and behaviour
  • InteractMLEditor\Private\InteractMLTrainingSetEditor.h - this has a custom asset editor panel
  • InteractMLEditor\Public\InteractMLLabelFactory.h - creation of this asset type
  • InteractMLEditor\Public\InteractMLLabelTableFactory.h - creation of this asset type
  • InteractMLEditor\Public\InteractMLModelFactory.h - creation of this asset type
  • InteractMLEditor\Public\InteractMLTrainingSetFactory.h - creation of this asset type

Blueprint Nodes

Unreal allows for the creation of custom Blueprint node types. These are used to implement some of the more advanced InteractML nodes.

Custom Blueprint Nodes

A brief explanation of how custom Blueprint nodes work.

  • Blueprints operate internally as just a load of function calls.
  • The custom code for the node controls how they are composed, what inputs and outputs are connected, and the order they are executed.
  • Custom blueprint nodes can have multiple functions chained together internally to implement their behaviour.
  • This structure is fixed at blueprint compile time and isn't dynamic at runtime (the custom node doesn't even exist at runtime; all of a graph's nodes get collapsed down into one big series of function calls and parameter passing).
  • It can, however, be dynamic at edit time, for example depending on the number of inputs you have, and even whether they are connected.

images/diagrams/Diagram_UnrealCustomBlueprints.png

Below is a breakdown of how some of the custom nodes built for InteractML are put together and operate.

Parameter Node

The Parameter Node collects together values of various differing types and adds them to a Parameter Collection structure for the recording and running processes. The node has a dynamic input list that can be added to and removed from. The list of pins drives the internal node generation process. After a function call to obtain a Parameter Collection object, each pin causes a utility function call that adds its own type of value to the collection. These calls are chained together until the collection is complete. The Parameter Collection is presented at the output for use elsewhere in the graph.

The InteractMLBlueprintLibrary class hosts a number of interoperation functions that the nodes use internally (as well as some other utility functions available to the user). The parameter add functions are as follows:

void AddIntegerParameter( FInteractMLParameters Parameters, int Value );
void AddFloatParameter( FInteractMLParameters Parameters, float Value );
void AddBooleanParameter( FInteractMLParameters Parameters, bool Value );
void AddVector2Parameter( FInteractMLParameters Parameters, FVector2D Value );
void AddVector3Parameter( FInteractMLParameters Parameters, FVector Value );
void AddQuaternionParameter( FInteractMLParameters Parameters, FQuat Value );
void AddColourParameter( FInteractMLParameters Parameters, FLinearColor Value );

images/diagrams/Diagram_ParameterCollectionNode.png
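The chained expansion described above can be sketched in plain C++: one call to obtain a collection, then one "add" call per connected input pin. This stand-in mirrors, but does not reproduce, the real InteractMLBlueprintLibrary API; like the real collection, it flattens every value to floats.

```cpp
#include <vector>

// Hedged sketch of what a Parameter node compiles down to.
struct ParametersSketch
{
    std::vector<float> Values; // all parameters flatten to a float stream
};

void AddFloatParameter(ParametersSketch& Parameters, float Value)
{
    Parameters.Values.push_back(Value);
}
void AddBooleanParameter(ParametersSketch& Parameters, bool Value)
{
    Parameters.Values.push_back(Value ? 1.0f : 0.0f);
}
void AddVector3Parameter(ParametersSketch& Parameters, float X, float Y, float Z)
{
    Parameters.Values.push_back(X);
    Parameters.Values.push_back(Y);
    Parameters.Values.push_back(Z);
}

// Equivalent of a Parameter node with three input pins (float, bool, vector)
ParametersSketch CollectParameters()
{
    ParametersSketch Parameters;                       // "get collection" call
    AddFloatParameter(Parameters, 0.5f);               // pin 1
    AddBooleanParameter(Parameters, true);             // pin 2
    AddVector3Parameter(Parameters, 1.0f, 2.0f, 3.0f); // pin 3
    return Parameters;                                 // presented at the output
}
```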

Recording Node

The recording node is largely a wrapper for the main Record function and most of the IO maps directly to the function parameter list.

images/diagrams/Diagram_RecordTrainingSetNode.png

Points of note:

  • The Actor input is configurable; it is normally automatically hooked up to the node's containing Actor, but can be explicitly provided.
  • The internal graph node unique ID is used to provide operating context for the function call (see the Context object above).
  • There are multiple record functions used, selected by the node's Mode (single/series) configuration property and the type of the Expected Output input (simple/composite type).

Training Node

The Train Model node operates in two different ways depending on whether it is configured to operate synchronously or asynchronously. Synchronous operation is basically a wrapper for the Train function (like the previous node), but asynchronous operation calls a different function and has some additional logic after it to provide a brief execution path on completion of the training operation.

images/diagrams/Diagram_TrainModelNode.png

Running Node

Like the Training Node, the Run node operates in two modes, synchronous and asynchronous, with additional logic for async operation.

images/diagrams/Diagram_RunModelNode.png

External Training Set

A wrapper for the External training set function providing node ID context.

images/diagrams/Diagram_ExternalTrainingSetNode.png

External Model

A wrapper for the External Model function providing node ID context.

images/diagrams/Diagram_ExternalModelNode.png