Introduction

About

🔨TODO: Blurb needed, e.g. "InteractML came out of research into interactive machine learning tools at the University of Arts London." but also needs to mention: Wekinator, the Unity version, RapidLib, the University, grants, the team behind it

Machine Learning

Interactive machine learning (IML) is a subfield of artificial intelligence (AI) research in which users, generally non-experts, can quickly create and test ML models. These models learn input/output mappings from real-time data through examples the user demonstrates (e.g. when the user moves their arms up and down, the character swims upwards).
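To make the idea concrete, here is a minimal sketch in plain C++ (illustrative only, not InteractML code) of the simplest example-based technique: a nearest-neighbour lookup that maps a live input to the output of the closest recorded example. The `Example` struct and `Classify` function are hypothetical names.

```cpp
#include <limits>
#include <vector>

// One recorded example: an input vector paired with the output it should produce.
struct Example {
    std::vector<float> Input;
    int Output; // e.g. 0 = "arms down", 1 = "arms up"
};

// Classify a live input by returning the output of the closest recorded example
// (squared Euclidean distance; assumes all vectors have the same length).
int Classify(const std::vector<Example>& Examples, const std::vector<float>& Live)
{
    int Best = -1;
    float BestDist = std::numeric_limits<float>::max();
    for (const Example& Ex : Examples) {
        float Dist = 0.0f;
        for (std::size_t i = 0; i < Live.size(); ++i) {
            const float D = Ex.Input[i] - Live[i];
            Dist += D * D;
        }
        if (Dist < BestDist) {
            BestDist = Dist;
            Best = Ex.Output;
        }
    }
    return Best; // -1 if no examples have been recorded yet
}
```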

Find out more on the Machine Learning page 👉

Human Input

Joysticks for videogames? Really? We can do more for our players. Using the power of Interactive Machine Learning, InteractML lets you control what interfaces you use as inputs and what the gestures look and feel like in your game. Let players tell you how they want to play. Set your game's interface free.

We provide a number of ready-made examples using different input devices, ranging from mouse, keyboard, and game controllers to modern VR systems with motion tracking. You can also pipe in values from your own custom devices, such as an 🔗Arduino, if you wish. Anything you can send to Unreal or read from outside the engine can be used in InteractML!
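As a rough sketch of what piping in a custom device might look like on the C++ side, an actor component can poll the device each frame and expose the readings to Blueprint, where they can be collected as parameters. `UMyDeviceInputComponent` and `ReadMyDevice` are hypothetical names, not part of InteractML, and the transport (serial, OSC, etc.) is up to you.

```cpp
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "MyDeviceInputComponent.generated.h"

// Hypothetical component that polls a custom device (e.g. an Arduino) each
// frame and exposes the readings to Blueprint for use as parameters.
UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UMyDeviceInputComponent : public UActorComponent
{
    GENERATED_BODY()
public:
    UMyDeviceInputComponent()
    {
        PrimaryComponentTick.bCanEverTick = true;
    }

    // Latest readings from the device, readable from any Blueprint graph.
    UPROPERTY(BlueprintReadOnly, Category="Input")
    TArray<float> DeviceValues;

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
        DeviceValues = ReadMyDevice();
    }

private:
    TArray<float> ReadMyDevice()
    {
        // Placeholder: replace with a real read over your chosen transport.
        return { 0.0f, 0.0f, 0.0f };
    }
};
```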

Find out more on the Demo Project page 👉

Applications

A machine learning system like this can be used for: spotting situations a player is in (irregular trigger volumes), fuzzy evaluation of game state (is the player struggling?), gesture recognition (mouse/pen-driven spell casting), general player facing direction (in VR), performative triggers (motion-captured dancer movement), and many more yet to be dreamed of.

Find out more on the Demo Project page 👉

The Principles

The InteractML Blueprint nodes let you record input parameters alongside the expected output values they correspond to, teach a machine learning system to associate those inputs with those outputs, and then run the trained system to turn input parameters it recognises from training into outputs that can be used to control all sorts of things.
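The shape of that record → train → run flow, sketched as a hypothetical C++ wrapper (in InteractML each phase is a Blueprint node; none of these names come from the plugin):

```cpp
#include <utility>
#include <vector>

// One recorded pairing of live inputs with the outputs we expect for them.
struct TrainingExample {
    std::vector<float> Inputs;
    std::vector<float> Outputs;
};

// Hypothetical wrapper showing the three phases of the workflow.
class Model {
public:
    // Phase 1: record examples pairing inputs with expected outputs.
    void Record(std::vector<float> Inputs, std::vector<float> Outputs)
    {
        Examples.push_back({ std::move(Inputs), std::move(Outputs) });
    }

    // Phase 2: teach the model to associate inputs with expected outputs.
    void Train()
    {
        // Fit the underlying algorithm to Examples; omitted in this sketch.
        bTrained = true;
    }

    // Phase 3: run the trained model on live inputs to drive the game.
    std::vector<float> Run(const std::vector<float>& LiveInputs) const
    {
        // A real model would match LiveInputs against what it learned;
        // this stub just shows the shape of the call.
        return bTrained && !Examples.empty() ? Examples.front().Outputs
                                             : std::vector<float>{};
    }

private:
    std::vector<TrainingExample> Examples;
    bool bTrained = false;
};
```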

Find out more on the Recording Examples, Training Models, and Running Models pages 👉

The input parameters can come from any source, such as player position, mouse pointer, VR controller position/orientation, values calculated from sources such as audio or video, or any combination of these collected together: anything that can later be read and matched by the ML system against the trained examples. For some models these are single snapshots at a point in time; for others, a series of snapshots taken over a period of time.
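For illustration, a single snapshot is just the chosen values flattened into one vector, and a series is a list of such snapshots; this sketch assumes nothing about InteractML's actual data format:

```cpp
#include <vector>

// One snapshot of whatever inputs you choose to combine. Here a player
// position and a controller orientation are flattened into one vector;
// the field choices are illustrative only.
struct Snapshot {
    std::vector<float> Values;
};

Snapshot CollectParameters(float PosX, float PosY, float PosZ,
                           float Pitch, float Yaw, float Roll)
{
    return Snapshot{ { PosX, PosY, PosZ, Pitch, Yaw, Roll } };
}

// Models that match movement over time take a series of snapshots instead.
using Series = std::vector<Snapshot>;
```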

Find out more on the Collecting Parameters page 👉

The output(s) can be a simple number or index used to correlate with the input parameters, or a more complex set of values associated together. For some models the outputs are discrete and only a given set of values is valid; for others, interpolation is performed and the output is a blend of the available output values.
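As an illustration of the interpolated case, an inverse-distance-weighted blend weights each recorded output by how close the live input is to that example's input. This is a generic regression-style technique, not necessarily the plugin's exact algorithm; `RegressionExample` and `BlendOutputs` are hypothetical names.

```cpp
#include <vector>

// One recorded example for a continuous (interpolating) model.
struct RegressionExample {
    std::vector<float> Input;
    float Output;
};

// Blend the recorded outputs, weighting each by the inverse of the squared
// distance between the live input and that example's input.
float BlendOutputs(const std::vector<RegressionExample>& Examples,
                   const std::vector<float>& Live)
{
    float WeightSum = 0.0f;
    float Blended = 0.0f;
    for (const RegressionExample& Ex : Examples) {
        float Dist2 = 0.0f;
        for (std::size_t i = 0; i < Live.size(); ++i) {
            const float D = Ex.Input[i] - Live[i];
            Dist2 += D * D;
        }
        const float Weight = 1.0f / (Dist2 + 1e-6f); // epsilon avoids divide-by-zero
        WeightSum += Weight;
        Blended += Weight * Ex.Output;
    }
    return WeightSum > 0.0f ? Blended / WeightSum : 0.0f;
}
```

A discrete model would instead return exactly one of the recorded output values, as in the nearest-neighbour sketch earlier.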

Find out more on the Labels page 👉

The Software

The InteractML system is a plugin code module that fully integrates into the 🔗Unreal Engine real-time 3D development system. It adds custom asset types for models and training data, custom 🔗Blueprint nodes for recording data and for training and running models, and custom views of some of the data.

Find out more on the Asset Types, Blueprint Nodes, and Training Set pages 👉

It is available free on the 🔗Unreal Marketplace. Add it to your project using the Epic Games Launcher as you would for any other plugin.

Find out more on the Setup page 👉

The Team & Terms

Who is behind InteractML?
What is InteractML made of?
What are its terms of use?
... find out on the Team and Terms page 👉


🏠 [[Home]] | [[Team and Terms]] 👉