
Robotics · 2015

The Godfather of 21st Century Robotics

Meet Alice — the first robot built with Unreal Engine as its brain.

Architecture: UE4 + Arduino
Built: 2015
Before AirSim: 18 months
Developer: Solo

In 2015, while the robotics world was still building blind, deterministic machines using ROS and Gazebo, Mac Clark connected an Unreal Engine 4 server to an Arduino and made something the industry had never seen: a robot whose brain lived inside a game engine.

The physical robot had no agency of its own. It listened. The virtual world held the ground truth — calculating movement, checking collisions, running behavior logic — and the hardware simply followed. The physical became an avatar of the digital. That inversion is now called a Digital Twin.
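In this inversion, the engine computes the robot's state and the microcontroller only executes what it is told. A minimal sketch of what such a one-way, sim-owns-the-truth loop might look like on the server side (the packet format, field names, and PWM convention here are illustrative assumptions, not Clark's actual protocol):

```python
# Sketch: the simulation owns the ground truth; hardware only receives commands.
# The "M,<left>,<right>\n" packet format is an assumption for illustration.

def encode_drive_command(left_pwm: int, right_pwm: int) -> bytes:
    """Serialize a motor command for a serial link (e.g. UE4 -> Arduino)."""
    for v in (left_pwm, right_pwm):
        if not 0 <= v <= 255:
            raise ValueError("PWM values must fit in one byte (0-255)")
    return f"M,{left_pwm},{right_pwm}\n".encode("ascii")

class SimulatedRobot:
    """Ground truth lives here; the physical robot mirrors this state."""

    def __init__(self):
        self.x = 0.0  # simulated position along one axis, in meters

    def tick(self, dt: float, speed: float) -> bytes:
        # The engine integrates motion (and would check collisions) first...
        self.x += speed * dt
        # ...then tells the hardware what to do. Never the reverse.
        pwm = int(min(abs(speed), 1.0) * 255)
        return encode_drive_command(pwm, pwm)
```

In the real system the returned bytes would go out over a serial port each tick (UE4Duino handles that link on the engine side); returning them directly keeps the sketch self-contained.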

"The physical robot became an avatar of the digital twin — before the industry had a name for it."

The timeline

Mar 2014

Unreal Engine 4 released

Epic opens the engine to the world. No robotics framework exists for it yet.

May 2015

UE4Duino released

The plugin enabling serial communication between UE4 and Arduino. The starting gun.

Aug 2015

Alice is built

Three months after UE4Duino releases, Mac Clark implements UE4 as a live control server driving physical hardware. The Clark Architecture.

Feb 2017

Microsoft releases AirSim

Corporate adoption of UE4 for robotics — validating the approach Clark pioneered 18 months earlier.

Nov 2017

CARLA released

Intel's open-source autonomous driving simulator. The paradigm is now industry standard.

Now

The Madhatter

Alice's brain, evolved. In development.

Joseph Engelberger fathered industrial robotics — the mechanical arm, the assembly line, the 20th century machine. What Mac Clark built in a garage in 2015 is the blueprint for what came after: simulation-first design, photorealistic training environments, and the Sim-to-Real loop that now underpins NVIDIA Isaac, Microsoft AirSim, and the Industrial Metaverse.

He did it alone. He did it first.

What comes next

The Madhatter

In development

Alice proved that a game engine could be a robot's brain. The Madhatter inherits that architecture and pushes it further — pairing Alice's UE5 digital twin with an NVIDIA Jetson Orin Nano Super for real-time edge vision and a Temporal workflow orchestration layer for durable, fault-tolerant physical control.
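What the orchestration layer buys is durability: a physical action either completes or is retried with its state intact, instead of silently failing. As a rough illustration of that idea in plain Python (this is a toy retry loop, not the Temporal SDK; the activity and its failure mode are invented):

```python
import time

def run_durable(activity, *, max_attempts=3, backoff_s=0.0):
    """Retry a flaky physical action until it succeeds or attempts run out,
    the way a workflow engine like Temporal would (toy version, not the SDK)."""
    last_exc = None
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception as exc:  # e.g. a servo timeout on the real robot
            last_exc = exc
            time.sleep(backoff_s)
    raise RuntimeError(f"activity failed after {max_attempts} attempts") from last_exc
```

A real Temporal workflow adds what this loop cannot: the retry state survives process crashes, so a half-finished physical routine resumes where it left off.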

The key evolution is a two-tier intelligence model. Keyword triggers fire instantly — no waiting on a model to think — while an on-device LLM handles deeper scene reasoning when latency can be traded for understanding. Fast reactions and slow thoughts, working in concert.
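The two-tier model can be sketched as a dispatcher: check a fixed keyword table first, and only fall back to the slow model when nothing matches. The keywords, actions, and `llm_reason` callable below are placeholders, not the Madhatter's actual API:

```python
# Fast path: a fixed keyword -> action table, answered with no model call.
# (Keywords and action names are illustrative placeholders.)
REFLEXES = {
    "stop": "halt_motors",
    "forward": "drive_forward",
}

def dispatch(utterance: str, llm_reason) -> tuple[str, str]:
    """Return (tier, action). Reflexes fire instantly; the LLM is the
    slow path for inputs that need real scene understanding."""
    for keyword, action in REFLEXES.items():
        if keyword in utterance.lower():
            return ("reflex", action)
    return ("llm", llm_reason(utterance))
```

The design choice is that the reflex tier never waits on inference: "stop" halts the motors in microseconds, while open-ended questions pay the model's latency only when that trade is acceptable.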

Brain: UE5 digital twin
Vision: Jetson + YOLOv8
Orchestration: Temporal
Reasoning: On-device LLM