Get it on Fab.com→

Overview

Prompt: "Cyberpunk city, raining, neon lights, 4k"

**UELTX2: Curated Generation** is a streamlined bridge between **Unreal Engine 5** and **Lightricks' LTX-2** generative video model.

Unlike external tools (Runway, Sora) that require cloud subscriptions and break your pipeline, UELTX2 runs **locally** on your hardware via ComfyUI. It allows you to generate animated placeholders, dynamic textures, and cinematic animatics directly inside the Editor Viewport.

Total Privacy

Your IP never leaves the local network. Perfect for NDA-bound projects.

Native Workflow

Right-click a texture to animate it. Drag-and-drop generation.

Architecture

Understanding how the pieces fit together is crucial for stability.

Unreal Engine 5: UELTX2 Subsystem (C++)
    ↓ HTTP POST (payload: prompt + image)
Local Python Server: ComfyUI (port 8188)
    ↓ Inference
Model: LTX-2 (GGUF)

The plugin sends a JSON payload to `localhost:8188`. ComfyUI processes the Diffusion Transformer (DiT) steps, saves an `.mp4` to disk, and UELTX2 auto-imports it as a `FileMediaSource`.
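The same request can be issued from any HTTP client, which is handy when debugging the bridge. A minimal sketch in Python, assuming a ComfyUI-style `/prompt` endpoint; the node IDs and graph shape below are purely illustrative (the plugin's real graph lives in its JSON templates):

```python
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188"

def build_payload(prompt_text: str, seed: int = 123456789) -> dict:
    """Build a ComfyUI /prompt payload. The node graph here is a
    minimal illustration, not the plugin's actual template."""
    workflow = {
        "6": {"inputs": {"text": prompt_text}},   # positive prompt (illustrative node)
        "9": {"inputs": {"seed": seed, "steps": 30, "cfg": 3.0,
                         "sampler_name": "euler", "denoise": 1.0}},
        "8": {"inputs": {"width": 768, "height": 512, "length": 49}},
    }
    # ComfyUI expects the node graph under the "prompt" key.
    return {"prompt": workflow, "client_id": "ueltx2"}

def submit(payload: dict) -> bytes:
    """POST the payload to the local ComfyUI server (requires it running)."""
    req = urllib.request.Request(
        COMFY_URL + "/prompt",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

If the server is up, `submit(build_payload("Cyberpunk city, raining, neon lights, 4k"))` queues a job exactly as the plugin does.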

Phase 1: Backend Setup

Before opening Unreal, you must configure the AI server.

1. Install ComfyUI

Download the standalone Windows version to avoid Python dependency issues.

2. Install Required Nodes

UELTX2 relies on specific custom nodes to handle video encoding and quantization.

  • Open ComfyUI Manager (Install it if you haven't).
  • Search for and install: ComfyUI-VideoHelperSuite (Essential).
  • Search for and install: ComfyUI-GGUF (For VRAM optimization).

3. Download Models

Place the LTX-2 weights in your checkpoints folder.

# Recommended: GGUF Quantized (16GB - 24GB VRAM)
C:\ComfyUI\models\checkpoints\LTX-2-dev-Q4_K_M.gguf  (example)

# Optional: Full Precision (24GB+ VRAM)
C:\ComfyUI\models\checkpoints\ltx219BNextGenAIVideo_ltx2FP8VERSION.safetensors  (example)

Phase 2: Plugin Setup

1. Installation

Steps to integrate into your C++ project:


1. Close Unreal Engine.
2. Copy the 'UELTX2' folder to 'YourProject/Plugins/'.
3. Right-click YourProject.uproject -> Generate Visual Studio project files.
4. Build the solution.

2. Configuration

Once compiled, open the Editor and navigate to Project Settings > Game > LTX-2 Generation.

Comfy URL: http://127.0.0.1:8188
Auto Import: True
Default Import Path: /Game/UELTX2_Generations/

Text-to-Video Workflow

The standard generation method for cinematic placeholders.

  1. Start your local ComfyUI server (run run_nvidia_gpu.bat).
  2. In UE5, click the LTX-2 Icon in the main toolbar.
  3. When the panel opens, enter your prompt (e.g., "A spinning golden coin, 4k, loopable").
  4. Click Generate.
  5. Wait for the OnGenerationComplete notification. The file will appear in your Content Browser automatically.
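The plugin raises OnGenerationComplete for you, but the same completion check can be scripted against ComfyUI's standard `/history` endpoint. A minimal sketch, where `prompt_id` is the value returned by the initial `/prompt` POST and `wait_for_video` is a hypothetical helper, not part of the plugin:

```python
import json
import time
import urllib.request

def has_outputs(history: dict, prompt_id: str) -> bool:
    """True when the /history entry for prompt_id lists saved outputs."""
    return bool(history.get(prompt_id, {}).get("outputs"))

def wait_for_video(base_url: str, prompt_id: str, timeout_s: float = 600.0) -> bool:
    """Poll ComfyUI every 2 seconds until the job finishes or the timeout elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        with urllib.request.urlopen(f"{base_url}/history/{prompt_id}") as resp:
            if has_outputs(json.loads(resp.read()), prompt_id):
                return True
        time.sleep(2.0)
    return False
```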

Image-to-Video (Texture Animating)

The "Curated Generation" power-feature. Turn static screenshots into dynamic elements.

Right-Click Action

Right-click any texture and choose Scripted Actions > Animate with LTX-2 to create a 2-second video variation of that texture, preserving its color and composition.

Mastering the Workflow

A deep dive into the interface, prompt engineering for LTX-2, and converting static assets into motion.

1. The Interface Panel

The UELTX2 Editor Utility Widget is your command center. It is designed to be dockable anywhere in the Unreal Editor layout (we recommend docking it next to the World Outliner).

Figure 2.1: The main EUW interface. Simple, efficient, native: a prompt field (e.g. "Cinematic wide shot, cyberpunk city raining, neon lights, 4k, fluid motion..."), a character counter, and a status readout (Bridge Connected, Port 8188).

2. Prompt Engineering for LTX-2

LTX-2 is a Diffusion Transformer. Unlike older U-Net models, it understands scene composition better but requires specific "trigger words" to get the best motion.

Weak Prompt
"A car driving."

Result: Often static, blurry, or the car morphs into the road. Lacks temporal context.

Strong Prompt
"Cinematic shot, low angle, joyful movement, a red sports car driving fast on a wet highway, motion blur, 4k, hyperrealistic"

Result: Dynamic camera movement, reflections, and consistent object permanence.

Key Tokens for Motion

  • "Slow pan right" - Control Camera
  • "Zoom in" - Dynamic depth
  • "Fluid motion" - Smooth liquids/smoke
  • "Loopable" - Good for textures

3. Context-Aware Generation (I2V)

The most powerful feature of UELTX2 is generating motion from existing project assets. This runs a different pipeline that preserves the color palette and composition of your source image.

1. Select Asset: a Texture2D in the Content Browser (PNG in)
2. Processing: compress and upload to ComfyUI
3. Result: an imported MediaSource asset (MP4 out)

Resolution Warning

Aspect Ratio Matters: LTX-2 is trained natively at roughly 768x512. If you input a square 1024x1024 texture, ComfyUI may crop or stretch it. For best results, use a source texture with a landscape aspect ratio.
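To pre-check how a texture would be cropped, the center-crop math is easy to sketch (assuming a 768x512, i.e. 3:2, target; the helper is illustrative, not plugin code):

```python
def center_crop_for(width: int, height: int,
                    target_w: int = 768, target_h: int = 512):
    """Return the (crop_w, crop_h) region matching the target aspect
    ratio, taken from the center of a width x height source."""
    target_aspect = target_w / target_h      # 1.5 for 768x512
    src_aspect = width / height
    if src_aspect > target_aspect:
        # Source is too wide: trim the sides.
        return round(height * target_aspect), height
    # Source is too tall (e.g. a square texture): trim top and bottom.
    return width, round(width / target_aspect)
```

A 1024x1024 texture, for instance, would lose roughly a third of its height before being scaled down to 768x512.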

4. Advanced Configuration (JSON Hacking)

Want to change the Seed, Steps, or Resolution? You don't need to recompile C++. Tweak the "Brain" of the plugin directly in the content folder.

Navigate to: Plugins/UELTX2/Content/Templates/LTX2_T2V.json. The arrow annotations in the listing below are for illustration only; JSON has no comment syntax, so remove them before saving the file.

LTX2_T2V.json
{
  "9": {
    "inputs": {
      "seed": 123456789,      <-- Change default seed
      "steps": 30,            <-- Increase for quality (e.g. 50)
      "cfg": 3.0,             <-- Guidance Scale
      "sampler_name": "euler",
      "denoise": 1.0
    },
    ...
  },
  "8": {
    "inputs": {
      "width": 768,           <-- Set custom width
      "height": 512,          <-- Set custom height
      "length": 49            <-- Frame count (approx 2s @ 24fps)
    }
  }
}

Note: Increasing the resolution sharply increases VRAM usage. 768x512 uses approx 10GB; 1280x720 requires 24GB+.
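Rather than hand-editing the file, the template can be patched from a script. A sketch assuming the node IDs shown above ("9" for the sampler, "8" for the latent size); `patch_template` and `duration_seconds` are hypothetical helpers:

```python
import json

def patch_template(path: str, seed=None, steps=None, width=None,
                   height=None, length=None) -> dict:
    """Load the T2V template, override any provided fields, write it back."""
    with open(path) as f:
        graph = json.load(f)
    sampler, size = graph["9"]["inputs"], graph["8"]["inputs"]
    for key, value in (("seed", seed), ("steps", steps)):
        if value is not None:
            sampler[key] = value
    for key, value in (("width", width), ("height", height), ("length", length)):
        if value is not None:
            size[key] = value
    with open(path, "w") as f:
        json.dump(graph, f, indent=2)
    return graph

def duration_seconds(length: int, fps: float = 24.0) -> float:
    """Frame count to clip duration: 49 frames at 24 fps is ~2 seconds."""
    return length / fps
```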

"Living Texture" Workflow for I2V

Turn static screenshots into dynamic, loopable materials in one click.

**Image-to-Video (I2V)** is the most powerful feature of UELTX2. It takes an existing texture from your project, encodes it into the Latent Space, and uses LTX-2 to hallucinate motion while preserving the original composition and color palette.

The "Zero-Click" Pipeline: Unlike standard video generation, UELTX2 automatically handles the entire Unreal Engine Media Framework setup for you. It creates the Player, sets it to loop, builds the Material, and starts playback instantly.

Automated Actions
  • Auto-Looping
  • Material Generation
  • Shader Linking

Step-by-Step Guide

Step 1: Select Source Asset

Locate a static texture in your Content Browser. Single-click to select it.
Recommendation: Use Landscape aspect ratios (e.g., 2:1) for best LTX-2 results.

Step 2: Set Source in Panel

In the UELTX2 Panel, click the "Set Source from Selection" button.

(The panel status switches to SOURCE ACTIVE; enter a motion prompt such as "Subtle smoke movement, loopable...".)

Step 3: The "Zero-Click" Result

After generation triggers the OnGenerationComplete event, navigate to your output folder (Default: /Game/UELTX2_Generations/).
You will see a complete Asset Family created for you:

  • Source.uasset (the imported FileMediaSource)
  • _Player.uasset (Media Player, pre-set to loop)
  • _Tex.uasset (Media Texture)
  • _Mat.uasset (Material)

Action: simply drag and drop the _Mat file onto your mesh.

Pro Tips for LTX-2 I2V

The "Loopable" Token

Always include words like loopable, seamless, infinite in your prompt if this texture is for a background element. LTX-2 will try to match the end frame to the start frame.

Color Preservation

Use the Denoise slider in the JSON config.
0.75 allows motion. 0.40 keeps the exact pixels but just warps them slightly (good for "breathing" characters).

Rapid Pre-Visualization Workflow for T2V

Generate "Instant Animatics" to block out cinematic timing before 3D production begins.

The "Greybox" Problem

Level Designers usually place static mannequins or text actors to represent complex events (e.g., "Building Collapses Here", "Explosion"). This makes it impossible to judge timing, sound design sync, lighting intensity, or camera cuts until weeks later.

The UELTX2 Solution

UELTX2 generates a bespoke video of the event and automatically wraps it into a native Level Sequence Asset. You simply drop this asset into your level, and the event plays on the timeline immediately.

(Sequencer preview: a Media Track with Media Source UELTX2_Output_Animatic, scrubbed to frame 0064 of 0120.)

# The Workflow

Step 1: Describe the Camera & Action

Open the LTX-2 Panel. In Text-to-Video mode (ensure no texture is selected), describe the shot. Use camera terminology.

"Wide shot, low angle, camera shake, a stone bridge collapsing into a river, dust, debris, physics destruction, cinematic lighting, 4k"
Step 2: Generate Animatic

Click Generate.
The plugin will generate the MP4, import it, create a FileMediaSource, create a Level Sequence, add a Media Track, and bind the source to the track automatically.

Step 3: Drag & Drop Integration

Navigate to your output folder. You will find a distinct asset type: an _Animatic Level Sequence.

Final action:

Drag this _Animatic asset directly into your Level Viewport.
In the Details Panel, verify "Auto Play" is checked. Hit Simulate or Play to watch your AI-generated cutscene happening in 3D space.

"Niagara Source" Workflow for VFX

Generate fluid simulations (Fire, Smoke, Magic) and inject them immediately into Niagara.

The "Simulation Killer"

Normally, creating a photorealistic explosion loop requires external software like EmberGen or Houdini. With UELTX2, you prompt for the effect, and the plugin automatically generates a ready-to-simulate asset family.

// C++ automation logic (simplified, illustrative)
Material->BlendMode = BLEND_Additive;        // black pixels render as transparent
Material->ShadingModel = MSM_Unlit;          // emissive only, ignores scene lighting
LinkNode(Texture, ParticleColor, Multiply);  // pseudocode: Texture x Particle Color

# The Workflow

Step 1: Prompting for Alpha

Use Text-to-Video mode. LTX-2 does not output Alpha channels (transparency), so you must rely on Additive Blending (Black = Transparent). You MUST explicitly instruct the model to create a black background.

Recommended Prompt: "Black background, isolated subject, massive fireball explosion, billowing smoke, slow motion, high contrast, 4k"
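The reason the black background works is the additive blend itself: each source pixel is added to whatever is behind it, so pure black contributes nothing. A quick numeric sketch of the math:

```python
def additive_blend(dest, src):
    """Additive blending: result = dest + src, clamped to 1.0 per channel.
    A black source pixel (0, 0, 0) leaves the destination unchanged,
    which is why LTX-2 VFX clips need a black background."""
    return tuple(min(1.0, d + s) for d, s in zip(dest, src))
```

Blending a black pixel over any scene color returns the scene color unchanged, i.e. it reads as fully transparent.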
Step 2: Generate Assets

Click Generate. Navigate to output folder.
Unlike the other workflows, UELTX2 detects the context and creates a special material suffix: _VFXMat.

(Auto-generated material graph: Texture Sample RGB multiplied by Particle Color RGB feeds Emissive Color, with luminance driving Opacity.)
Step 3: Niagara Assembly

If you have the NS_LTX2_Template stored in your plugin folder, a duplicate _VFX system is created automatically. Otherwise, create a new System:

Module settings:
  • Sprite Renderer: drag _VFXMat into the Material slot. Set Alignment to Velocity Aligned (optional).
  • Particle Update: use Color Over Life to tint the fire; the generated graph multiplies particle color into the texture.
  • SubUV / Animation: do not use. This is a video texture, not a SubUV sheet; it plays automatically via the Media Player.
Important: Media Player Latency

Video textures have a tiny startup delay. For instant explosions (e.g., Gunshots), LTX-2 videos are slightly too slow. This workflow is best for looping elements (torches, burning wreckage, magic portals).

Troubleshooting & Diagnostics

Solve connection refusals, memory spikes, and missing assets.

1. Quick Diagnostics

Error: Connection Refused

> POST http://127.0.0.1:8188/prompt

> LogUELTX2: Error: Connection Refused. Is ComfyUI running?

Fix: Ensure `run_nvidia_gpu.bat` is running and the port matches Project Settings.

Error: Bad Request (400)

> POST http://127.0.0.1:8188/prompt

> LogUELTX2: Error: 400 Bad Request. Node(s) missing.

Fix: You are missing custom nodes (likely VideoHelperSuite). Check ComfyUI console.
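Both failures can be triaged from outside Unreal with a few lines. A minimal sketch, assuming only that something is (or is not) listening on the configured port; `comfy_reachable` is a hypothetical helper, not part of the plugin:

```python
import urllib.request
import urllib.error

def comfy_reachable(base_url: str, timeout_s: float = 2.0) -> bool:
    """True if something is accepting connections at base_url.
    Distinguishes 'Connection Refused' (server not running) from
    HTTP-level errors like 400 (server up, request malformed)."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout_s)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # refused, timed out, or unreachable
```

If `comfy_reachable("http://127.0.0.1:8188")` returns False, fix the server first; if it returns True but generation still fails with a 400, look at missing custom nodes.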

2. VRAM & Out of Memory (OOM)

LTX-2 is a heavy model. If Unreal Engine crashes or ComfyUI reports CUDA Out of Memory, check your configuration against this gauge.

VRAM tiers:

  • Consumer (8GB - 12GB, e.g. RTX 3060 / 4070): must use GGUF Q5_K_M quantization. Max resolution 768x512.
  • Enthusiast (16GB - 24GB, e.g. RTX 3090 / 4090): can run bf16 full precision, or high-res GGUF up to 1280x720.

Optimization Checklist

  • Limit Frame Rate: In UE5 Editor Preferences, enable "Use Less CPU when in Background" to free resources for ComfyUI.
  • Close High-VRAM assets: Close Metahuman blueprints or massive levels in UE5 before generating.
  • Use GGUF: Installing the ComfyUI-GGUF node is the #1 way to fix memory crashes.

3. The "Red Node" Problem

If the console logs "Success" but no video appears, drop the template JSON into ComfyUI manually to verify integrity.

Symptoms of Missing Nodes
  • ComfyUI loads the workflow, but some nodes appear red.
  • The console says ImportError: failed to load...
Solution: Open ComfyUI Manager > "Install Missing Custom Nodes"

4. Unreal Engine Quirks

Toolbar Button Missing
If the LTX-2 icon is missing from the toolbar:
  1. Check Window > Generative AI in the top menu bar.
  2. Ensure Plugins/UELTX2/Resources/ButtonIcon.png exists.
  3. Check "Output Log" for LogUELTX2Editor errors during startup.
Generated Video is Black / Corrupt
This is usually a codec issue.
  • Ensure ComfyUI-VideoHelperSuite is configured to export h264-mp4 (which we set in JSON).
  • Unreal requires Electra Player or WmfMedia plugins to be enabled to play H.264 inside the editor. Check Plugins > Built-in > Media Players.
"Could not find EUW" Error
The C++ code looks for a strict path: /UELTX2/UI/EUW_UELTX2_Panel.
If you renamed the plugin folder or moved the widget, the toolbar button will fail.
Fix: Ensure "Show Plugin Content" is enabled in the Content Browser, and that the widget is still at its expected location under Plugins/UELTX2 Content/UI/.

Created by Andras Gregori @ Gregorigin, a single dad currently based outside Budapest, HU.

"Building tools that empower creators to shape worlds."