
Unreal Engine plugin for TensorFlow. Enables training and implementing state-of-the-art machine learning algorithms in your Unreal projects.

This plugin contains C++, Blueprint, and Python scripts that encapsulate TensorFlow operations as an Actor Component. It depends on an UnrealEnginePython plugin fork and the SocketIO Client plugin; these are always included in binary releases, so no manual external downloading is necessary. See the Note on Dependencies section for details on implementation and architecture.

See the Unreal forum thread for discussions.

Issues and Limitations

There is currently only a working build for the Windows platform.

Linux issue#13 tracking

Android issue#11 tracking

Mac OS issue#10 tracking

If you have ideas or fixes, consider contributing! See for current issues.

Installation & Setup

  1. (GPU only) Install the CUDA and cuDNN prerequisites if you're using a compatible (NVIDIA) GPU.
  2. Download the Latest Release and choose the CPU or GPU version, where supported.
  3. Create a new project or choose an existing one.
  4. Browse to your project folder (typically found at Documents/Unreal Projects/{Your Project Root})

copy plugins

  1. Copy the Plugins folder into your project root.
  2. (Optional) All plugins should be enabled by default; you can confirm via Edit->Plugins. Scroll down to Project and you should see three plugins: TensorFlow in Computing, Socket.IO Client in Networking, and UnrealEnginePython in Scripting Languages. If any is disabled, click Enabled, restart the editor, and open your project again.
  3. The plugin is now ready to use.

Note on Git Cloning

Using the full plugin binary releases is recommended; this allows you to follow the installation instructions as written and get up to speed quickly.

If you instead wish to git clone and sync to the master repository manually, you are expected to download the latest python binary dependency release for UnrealEnginePython. This contains an embedded python build; select the BinariesOnly-.7z file from Downloads and drag its Plugins folder into your project root. With that step complete, your cloned repository should work as expected; all other dependencies will be pulled via pip on first launch.


mnist spawn samples

Basic MNIST softmax classifier trained on BeginPlay, with sample training inputs streamed to the editor during training. When fully trained, UTexture2D samples (1-3) are tested for prediction.

An example project is found at

The repository has basic examples for general TensorFlow control and different MNIST classification examples with UE4 UTexture2D input for prediction. The repository should expand as more plug-and-play examples are made. Consider contributing samples via pull requests!

It is also the main repository where development of all of this plugin's dependencies is tracked.

Python API

You can either train directly or use a trained model inside UE4.

To start, add your python script file to {Project Root Folder}/Content/Scripts.

Wrap your TensorFlow python code by subclassing TFPluginAPI.


import tensorflow, unreal_engine and TFPluginAPI in your module file and subclass the TFPluginAPI class with the following functions.

import tensorflow as tf
import unreal_engine as ue
from TFPluginAPI import TFPluginAPI

class ExampleAPI(TFPluginAPI):

	#expected optional api: setup your model for training
	def onSetup(self):
		pass

	#expected optional api: parse input object and return a result object, which will be converted to json for UE4
	def onJsonInput(self, jsonInput):
		result = {}
		return result

	#expected optional api: start training your network
	def onBeginTraining(self):
		pass

#NOTE: this is a module function, not a class function. Change your CLASSNAME to reflect your class
#required function to get our api
def getApi():
	#return CLASSNAME.getInstance()
	return ExampleAPI.getInstance()

Note the getApi() module function which needs to return a matching instance of your defined class. The rest of the functionality depends on what API you wish to use for your use case. At the moment the plugin supports input/output from UE4 via JSON encoding.

If you wish to train in UE4, implement your logic in onBeginTraining() and ensure you check self.shouldStop after each batch/epoch to handle early exit requests from the user, e.g. on EndPlay or when you manually call StopTraining on the tensorflow component. You will also receive an optional onStopTraining callback when the user stops your training session.
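The training-loop pattern described above can be sketched as follows. A minimal stand-in for the plugin's TFPluginAPI base class is included so the sketch runs outside the editor; in a real project you subclass the real TFPluginAPI instead, and TrainingExampleAPI is an illustrative name.

```python
#Stand-in for the plugin's TFPluginAPI base class (illustration only).
class TFPluginAPIStandIn(object):
	_instance = None

	def __init__(self):
		#the plugin sets this to True on EndPlay or a manual StopTraining call
		self.shouldStop = False

	@classmethod
	def getInstance(cls):
		if cls._instance is None:
			cls._instance = cls()
		return cls._instance

class TrainingExampleAPI(TFPluginAPIStandIn):
	def onBeginTraining(self):
		completedEpochs = 0
		for epoch in range(100):
			#check for an early exit request after every batch/epoch
			if self.shouldStop:
				break
			# ... run one batch/epoch of training here ...
			completedEpochs += 1
			if completedEpochs == 3:
				#simulate the user stopping training mid-run
				self.shouldStop = True
		return {'epochs': completedEpochs}
```

The key point is simply that the loop re-reads self.shouldStop every iteration, so a stop request from the game thread takes effect at the next batch boundary.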

If you have a trained model, simply set up your model or load it from disk in onSetup(), omit the training function, and forward your evaluation input via the onJsonInput(jsonArgs) callback. See example on how to train a network once, save the model, and reload it on setup so that you skip retraining every time.

Note that both onBeginTraining() and onSetup() are called asynchronously by default. If you use a high-level library such as Keras, you may need to store your tf.Session and tf.Graph separately and set them as default via with self.session.as_default(): and with self.graph.as_default(): when you evaluate, since calls will generally be made from separate threads.

Below is a very basic example of using tensorflow to add or subtract values passed in as {"a":<float number or array>, "b":<float number or array>}.

import tensorflow as tf
import unreal_engine as ue
from TFPluginAPI import TFPluginAPI

class ExampleAPI(TFPluginAPI):

	#expected optional api: setup your model for training
	def onSetup(self):
		self.sess = tf.InteractiveSession()

		self.a = tf.placeholder(tf.float32)
		self.b = tf.placeholder(tf.float32)

		self.c = self.a + self.b

	#expected optional api: json input as a python object, get a and b values as a feed_dict
	def onJsonInput(self, jsonInput):
		#show our input in the log
		ue.log(jsonInput)

		#map our passed values to our input placeholders
		feed_dict = {self.a: jsonInput['a'], self.b: jsonInput['b']}

		#run the calculation and obtain a result
		rawResult =, feed_dict=feed_dict)

		#convert to array and embed the answer as 'c' field in a python object
		return {'c': rawResult.tolist()}

	#custom function to change the operation type
	def changeOperation(self, type):
		if(type == '+'):
			self.c = self.a + self.b

		elif(type == '-'):
			self.c = self.a - self.b

	#expected optional api: We don't do any training in this example
	def onBeginTraining(self):
		pass

#NOTE: this is a module function, not a class function. Change your CLASSNAME to reflect your class
#required function to get our api
def getApi():
	#return CLASSNAME.getInstance()
	return ExampleAPI.getInstance()

A full example using mnist can be seen here:

A full example using save/load setup can be seen here:

Another full example using the Keras API can be found here: Note the Keras callback used for stopping training after the current batch completes; this cancels training on early gameplay exit, e.g. EndPlay.

Asynchronous Events to Tensorflow Component

If you need to stream some data to blueprint, e.g. during training, you can use the self.callEvent() API.

String Format

The format is self.callEvent('EventName', 'MyString')

Json Format

The format is self.callEvent('EventName', PythonObject, True)
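Both formats can be sketched with a stand-in for the event bridge. EventSketch is illustrative; in a real project self.callEvent comes from TFPluginAPI, and the event names below are arbitrary examples.

```python
import json

#Stand-in for the plugin's event bridge so the sketch runs outside the editor.
class EventSketch(object):
	def __init__(self):
		self.emitted = []   #(event name, payload) pairs, for illustration only

	def callEvent(self, name, data, isJson=False):
		#json format serializes the python object; string format passes through
		payload = json.dumps(data) if isJson else data
		self.emitted.append((name, payload))

api = EventSketch()
api.callEvent('ProgressEvent', 'epoch 1 done')              #string format
api.callEvent('PixelEvent', {'pixels': [0.0, 1.0]}, True)   #json format
```

On the blueprint side each call arrives as an On Event dispatch carrying the event name and payload, which you can filter by name.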

Example use case in where sample training images are emitted to unreal for preview.

Blueprint API

Load your python module from your TensorflowComponent

Once you've written your python module, Select your TensorflowComponent inside your actor blueprint

select component

and change the TensorFlowModule name to reflect your filename without .py. e.g. if my python file is it would look like this

change module name

Optionally disable the verbose python log and change other toggles such as training on BeginPlay or disabling multithreading (not recommended).


By default the onBeginTraining() function will get called on the component's begin play call. You can optionally untick this option and call Begin Training manually.

manual train

Sending Json inputs to your model for e.g. prediction

You control what type of data you forward to your python module; the only limitation of the current API is that it should be JSON formatted.

Basic Json String

In the simplest case you can send e.g. a basic json string {"MyString":"SomeValue"} constructed using SIOJson like so

send json string

Any UStruct Example

SIOJson supports completely user defined structs, even ones only defined in blueprint. It's highly recommended to use such structs for a convenient way to organize your data and to reliably decode it on the python side. Below is an example where we send a custom bp struct and encode it straight to JSON.

send custom struct

with the struct defined in blueprint as

custom struct definition

You can also interweave structs, even common Unreal types, so feel free to mix and match both of the above methods. In this particular example we interweave a 3D vector in a json object we defined. The sent input should now be {"SomeVector":{"x":1.0,"y":2.3,"z":4.3}}

send struct
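On the python side the interwoven struct above arrives as nested plain dicts, decoded before onJsonInput is called. A small sketch, using the example payload shown above:

```python
#Sketch: what onJsonInput receives for the interwoven vector example.
#The plugin decodes the json into plain python dicts before the callback.
jsonInput = {'SomeVector': {'x': 1.0, 'y': 2.3, 'z': 4.3}}

vector = jsonInput['SomeVector']
#nested fields are ordinary dict lookups
magnitudeSquared = vector['x'] ** 2 + vector['y'] ** 2 + vector['z'] ** 2
```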

Special convenience case: UTexture2D

A convenience function wraps a UTexture2D into a json object with {"pixels":[<1D array of pixels>], "size":{"x":<image width>,"y":<image height>}} which you can reshape using numpy.

send texture

Note that this currently converts an image into full-alpha greyscale. If you need color texture inputs, use your own custom method or make a pull request.
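Reshaping the flat pixel payload back into image form is a one-liner with numpy. A sketch using a hypothetical 3x2 texture payload in the convenience format described above:

```python
import numpy as np

#Hypothetical payload matching the convenience format:
#a flat greyscale pixel array plus a size object (width x, height y).
jsonInput = {
	'pixels': [0.0, 0.25, 0.5, 0.75, 1.0, 0.0],
	'size': {'x': 3, 'y': 2},
}

size = jsonInput['size']
#reshape the 1D array into (height, width) rows for use as an image tensor
image = np.array(jsonInput['pixels'], dtype=np.float32).reshape(int(size['y']), int(size['x']))
```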

Custom functions

If you need to call python functions from blueprint which the current API doesn't support, you can do so by using the CallCustomFunction method on the TensorflowComponent. You specify the function name and pass in a string as arguments. The function runs on the game thread and returns immediately with an expected string value. For both arguments and return values, JSON encoding is recommended but optional.

custom function call

Example custom function call passing in a string argument to changeOperation in
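On the python side a custom function simply receives the argument string. The sketch below uses JSON encoding for the argument and return value, as recommended above; OperationSketch is a stand-in, and in a real project the method would live on your TFPluginAPI subclass.

```python
import json

#Stand-in sketch of a custom function reachable via CallCustomFunction.
class OperationSketch(object):
	def __init__(self):
		self.operation = '+'

	def changeOperation(self, args):
		#arguments arrive as a single string; json keeps multiple values tidy
		params = json.loads(args)            #e.g. '{"type":"-"}'
		self.operation = params['type']
		#return a string (here json-encoded) back to blueprint
		return json.dumps({'operation': self.operation})
```

CallCustomFunction('changeOperation', '{"type":"-"}') on the component would then reach this method with the json string as its argument.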

Handling Tensorflow Events

Select your Tensorflow Component from your actor blueprint and then click + to subscribe to the chosen event in the event graph.


current api supports the following events

On Input Results

Called when onJsonInput() completes in your python module. The returned data is a JSON string of the object you return at the end of that function.


Normally you'd want to convert this string into an SIOJsonObject so you can use your results data in blueprint. It is also typical to have a prediction field attached to this object, e.g. for classification tasks.

If you have a regular return format, consider making your own custom bp struct and fill its value from the json string like this

fill struct from json

Note that the function will only fill fields that have matching names and ignore all other struct fields. This means you can safely fill a partial struct from a json string that has more fields than the struct defines.

On Training Complete

When the onBeginTraining() call is complete you receive this event with {'elapsed':<time taken>} json, optionally with additional return data passed in from your function.


On Event

If you use self.callEvent() you will receive this event dispatch. You can filter your event types by the event name and then do whatever you need to with the data passed in.


For example uses self.callEvent() to async stream training images and we'd filter this via checking for 'PixelEvent'

Blueprint Utilities


A large portion of the plugin capability comes from its ability to convert data types. See TensorflowBlueprintLibrary.h for full declarations and code comments.

UTexture2D to float array (grayscale)

Convert a UTexture2D as grayscale to a 1D float array; obtains size from texture.


ToGrayScaleFloatArray (Texture2D)


static TArray<float> Conv_GreyScaleTexture2DToFloatArray(UTexture2D* InTexture);

UTexture2D to float array

Convert a UTexture2D to a 1D float array; obtains size from texture. Expects 4 1-byte values per pixel e.g. RGBA.


ToFloatArray (Texture2D)


static TArray<float> Conv_Texture2DToFloatArray(UTexture2D* InTexture);

Invert Float Array

Invert values in a given float array (1->0, 0->1) on a 0-1 scale.




static TArray<float> InvertFloatArray(const TArray<float>& InFloatArray);

Float array to UTexture2D

Convert a 4-value-per-pixel float array to a UTexture2D of the specified size; if the size is unknown (0,0), a square array is assumed.


ToTexture2D (Float Array)


static UTexture2D* Conv_FloatArrayToTexture2D(const TArray<float>& InFloatArray, const FVector2D Size = FVector2D(0,0));

Float array (Grayscale) to UTexture2D

Convert a 1-value-per-pixel float array to a UTexture2D of the specified size; if the size is unknown (0,0), a square array is assumed.


ToTexture2D (Grayscale Array)


static UTexture2D* Conv_FloatArrayToTexture2D(const TArray<float>& InFloatArray, const FVector2D Size = FVector2D(0,0));

ToTexture2D (Render Target 2D)

Convert a UTextureRenderTarget2D to a UTexture2D


ToTexture2D (Render Target 2D)


static UTexture2D* Conv_RenderTargetTextureToTexture2D(UTextureRenderTarget2D* InTexture);

ToFloatArray (bytes)

Convert a byte array into a float array, normalized by the passed in scale


ToFloatArray (bytes)


static TArray<float> Conv_ByteToFloatArray(const TArray<uint8>& InByteArray, float Scale = 1.f);

TF Audio Capture Component

A C++ component that uses the Windows API to capture and stream microphone audio without the need for an online subsystem. See for details on API.

This component is intended for native speech recognition once TensorFlow examples mature.

File Utility Component

A simple blueprint wrapper to save and load bytes from file. Allows you to easily flush e.g. audio capture data for later use. See for details on API.

Note on Dependencies

Depends on an UnrealEnginePython plugin fork and the SocketIO Client plugin. Both of these, along with an embedded python build, are included in every release, so you don't need to manually include anything; just drag and drop the Plugins folder into your project from any release.

Architecture and Purpose



Based on the wonderful work by 20tab, the UnrealEnginePython plugin fork contains changes to enable multi-threading, python script plugin encapsulation, and automatic dependency resolution via pip. Simply specifying tensorflow as a pythonModule dependency in makes the editor auto-resolve the dependency on first run. The multi-threading support contains a callback system that allows long-duration operations (e.g. training) to happen on a background thread while callbacks are received on your game thread. This enables TensorFlow to work without noticeably impacting the game thread.

SocketIO Client

SocketIO Client is used for easy conversion between native engine types (BP or C++ structs and variables) and python objects via JSON. It can optionally be used to connect to a real-time web service via

Troubleshooting / Help

No module named 'tensorflow'

On first run you may see this message in your python console

no tensorflow

Wait until pip has fully installed your dependencies; this may take ~3-5 min. When the dependencies have installed, it should look something like this


After you see this, close your editor and re-launch the project. The error should no longer appear.

2-3 sec hitch on first begin play

This is caused by python importing tensorflow and loading all its DLLs on begin play. It is currently unavoidable, but only happens once per editor launch.

Issue not listed?

Post your issue to


License

Plugin - MIT

TensorFlow and TensorFlow Icon - Apache 2.0

Latest Releases

v0.4.1 for UE4.17 (Aug. 27, 2017)
v0.4.0 for UE4.17 (Aug. 24, 2017)
v0.3.0 for UE4.17 (Aug. 17, 2017)
v0.2.1 for UE4.16 (Jun. 6, 2017)
v0.2.0 for UE4.15 (Jun. 3, 2017)