
Overview

Prompt: "Cyberpunk city, raining, neon lights, 4k"

**UELTX2: Curated Generation** is a streamlined bridge between **Unreal Engine 5** and **Lightricks' LTX-2** generative video model.

Unlike external tools (Runway, Sora) that require cloud subscriptions and break your pipeline, UELTX2 runs **locally** on your hardware via ComfyUI. It allows you to generate animated placeholders, dynamic textures, and cinematic animatics directly inside the Editor Viewport.

Total Privacy

Your IP never leaves the local network. Perfect for NDA-bound projects.

Native Workflow

Right-click a texture to animate it. Drag-and-drop generation.

Architecture

Understanding how the pieces fit together is crucial for stability.

Unreal Engine 5 → UELTX2 Subsystem (C++)
    ↓ HTTP POST (payload: prompt + image)
Local Python Server → ComfyUI
    ↓ Inference (model: LTX-2, GGUF)

The plugin sends a JSON payload to your configured ComfyURL (default: http://127.0.0.1:8188). ComfyUI processes the Diffusion Transformer (DiT) steps, saves an output file to disk, and UELTX2 auto-imports it as a FileMediaSource.
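The request half of this cycle can be sketched in a few lines. This is an illustration only: the /prompt endpoint is ComfyUI's real HTTP API, but the workflow graph and client_id below are placeholders, not the plugin's actual template or internals.

```python
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188"  # default ComfyURL in Project Settings

def build_payload(workflow: dict, client_id: str = "ueltx2") -> bytes:
    """Wrap a ComfyUI node graph in the JSON body /prompt expects."""
    return json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")

def queue_prompt(workflow: dict) -> dict:
    """POST the graph; ComfyUI answers with a prompt_id you can poll for results."""
    req = urllib.request.Request(
        COMFY_URL + "/prompt",
        data=build_payload(workflow),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The plugin performs the equivalent of `queue_prompt`, then waits for ComfyUI to write the output file before importing it.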

Figure 1.1: Breakdown of the HTTP request/response cycle.

Phase 1: Backend Setup

UELTX2 currently supports ComfyUI only. SwarmUI support has been removed in this build to ensure reliability and consistent output handling.

1. Install ComfyUI (Recommended)

Download the standalone Windows version to avoid Python dependency issues.

2. Node Requirements

UELTX2’s default workflows run on core ComfyUI nodes (no custom nodes required). If you plan to use GGUF-quantized models, install the optional ComfyUI-GGUF node package.

ComfyUI Manager
Use ComfyUI Manager to install optional packages (e.g., ComfyUI-GGUF).

3. Auto-Setup (Recommended)

In UE, open the UELTX2 panel and use ComfyUI Setup → Auto-Setup to configure extra model paths. Use Auto-Setup + Downloads to download the LTX-2 model and the required text encoder. Restart ComfyUI after it finishes.

4. Download Models

Place the LTX-2 weights in your checkpoints folder if you did not use auto-download.

# Recommended: GGUF Quantized (16GB - 24GB VRAM)
C:\ComfyUI\models\checkpoints\LTX-2-dev-Q4_K_M.gguf  (example)

# Optional: Full Precision (24GB+ VRAM)
C:\ComfyUI\models\checkpoints\ltx219BNextGenAIVideo_ltx2FP8VERSION.safetensors  (example)
Folder Structure
Ensure the weights are in the standard Checkpoints folder.

5. Confirm Output Folder

UELTX2 can fall back to reading ComfyUI's local output directory if HTTP downloads fail. Make sure Output Directory (Fallback) in Project Settings points to your ComfyUI output folder.

Phase 2: Plugin Setup

1. Installation

Steps to integrate the plugin into your C++ project:

1. Close Unreal Engine.
2. Copy the UELTX2 folder to YourProject/Plugins/.
3. Right-click YourProject.uproject and choose Generate Visual Studio project files.
4. Build the solution in Visual Studio.
2. Configuration

Once compiled, open the Editor and navigate to Project Settings > Game > UELTX2 Generation. The plugin now exposes granular control over the generation pipeline.

Connection & Backend

  • Backend Type: ComfyUI
  • Comfy URL: http://127.0.0.1:8188
  • Output Directory: C:\ComfyUI\output\
  • Polling Interval: 1.0 sec
  • Allow Remote URLs: False (localhost only)
  • Job Timeout: 300 sec
  • Max Retries: 1
  • Prompt Sanitization: Enabled
  • Max Prompt Length: 2000 characters
  • Validate Output: Enabled (mp4/webm signature check)

Model Management

  • Model Directories: (array) paths to scan for .gguf / .safetensors
  • Selected Model: LTX-2-dev-Q4_K_M.gguf

Generation Defaults

  • Resolution: 768 x 512
  • Frames / FPS: 49 frames @ 24 fps
  • Sampling: Steps 20 | CFG 7.0 | Euler
  • Denoise (I2V): 1.0

Presets & UI State

  • Preset Profiles: Storyboard 512 | Preview 720p | Final 1024
  • Use Last Params: Enabled
  • Last Used Params: stored per-project

Asset Automation

  • Create Material / Create Sequence / Create VFX (Niagara): toggles for automatic asset creation
  • Persistent Media Copy: Content/Movies/UELTX2
  • Delete Source After Import: Enabled (if copy succeeds)

Text-to-Video Workflow

The standard generation method for cinematic placeholders.

  1. Start your local ComfyUI server (run run_nvidia_gpu.bat).
  2. In UE5, click the LTX-2 icon in the main toolbar.
  3. When the panel opens, enter your prompt (e.g., "A spinning golden coin, 4k, loopable").
  4. Click Generate.
  5. Wait for the OnGenerationComplete notification. The file will appear in your Content Browser automatically.

Image-to-Video (Texture Animating)

The "Curated Generation" power-feature. Turn static screenshots into dynamic elements.

Right-Click Action

Simply run Right-Click > Scripted Actions > Animate with LTX-2 on any texture to create a 2-second video variation of that texture, preserving its color and composition.

Context Menu
The "Zero-Configuration" workflow for textures.

Mastering the Workflow

A deep dive into the interface, prompt engineering for LTX-2, and converting static assets into motion.

1. The Interface Panel

The UELTX2 Panel is a native Slate window (no Blueprint UI dependencies). It is designed to be dockable anywhere in the Unreal Editor layout (we recommend docking it next to the World Outliner).

Figure 2.1: The main UELTX2 Panel: prompt field with character counter and a System Status readout ("Bridge Connected, Port 8188"). Simple, efficient, native.

2. Health & Connection Monitoring

UELTX2 now exposes a Health section that reports HTTP status, WebSocket status, polling fallback, output directory, and the last backend error. This makes it obvious whether you are connected, running on polling fallback, or fully real-time.

  • HTTP OK confirms the REST API is reachable.
  • WS Connected enables real-time progress updates.
  • Polling On indicates WebSockets were unavailable; the plugin is still retrieving outputs.
  • Retries shows remaining automatic retries if a job stalls.

If a job stalls beyond JobTimeoutSeconds, UELTX2 will retry automatically up to JobMaxRetries. Queue position is displayed in the Progress area when multiple jobs are waiting.
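The retry behavior is easy to picture as a loop over whole attempts, each bounded by the timeout. The plugin's real implementation is C++; in this sketch `poll_once` is an injectable callable standing in for a /history status check, so only the retry shape is shown.

```python
import time

def run_with_retries(poll_once, timeout_s=300.0, max_retries=1, interval_s=1.0):
    """Call poll_once() until it returns a result; retry whole attempts that
    stall past timeout_s (mirrors JobTimeoutSeconds / JobMaxRetries)."""
    for attempt in range(max_retries + 1):
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            result = poll_once()          # None while the job is still pending
            if result is not None:
                return result
            time.sleep(interval_s)        # PollingInterval between checks
    raise TimeoutError("job stalled on every attempt")
```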

3. Presets & Last-Used Parameters

Presets let you apply a curated configuration instantly (resolution, frames, steps, sampler, etc.). UELTX2 also persists your last-used parameters so reopening the panel restores the exact state you left it in.

Default Presets: Storyboard 512, Preview 720p, Final 1024. Customize or add your own in Project Settings > Game > UELTX2 Generation.

4. Model Scanning & Auto-Selection

Model directories are scanned for .gguf and .safetensors. The first LTX model found is auto-selected, and your selection is persisted in settings. Model folders are also watched in the editor, so adding or removing a model updates the list without restarting.

If you mix GGUF and SafeTensors, UELTX2 selects the appropriate template and warns if a template/model mismatch is detected.

Image-only checkpoints (e.g., SDXL 1.0) are supported. UELTX2 will generate a single image and repeat it into an MP4 (static video). For real motion, use an LTX video model.

Motion induction for image models is enabled by default via randomized latent batches. For subtler motion, lower Denoise (e.g., 0.2–0.5).

5. Prompt Engineering for LTX-2

LTX-2 is a Diffusion Transformer. Unlike older U-Net models, it understands scene composition better but requires specific "trigger words" to get the best motion.

Prompts are sanitized automatically (control characters removed, whitespace trimmed). If enabled in settings, prompts are also clamped to a maximum length for stability.
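The sanitization pass described above amounts to roughly the following. This is an illustrative sketch: the plugin's exact character rules live in its C++ code and are an assumption here.

```python
def sanitize_prompt(text: str, max_len: int = 2000) -> str:
    """Strip control characters, collapse whitespace runs, clamp length."""
    cleaned = "".join(ch for ch in text if ch.isprintable() or ch in "\n\t")
    cleaned = " ".join(cleaned.split())   # collapses newlines/tabs/spaces
    return cleaned[:max_len]              # Max Prompt Length clamp
```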

Weak Prompt
"A car driving."

Result: Often static, blurry, or the car morphs into the road. Lacks temporal context.

Strong Prompt
"Cinematic shot, low angle, joyful movement, a red sports car driving fast on a wet highway, motion blur, 4k, hyperrealistic"

Result: Dynamic camera movement, reflections, and consistent object permanence.

Key Tokens for Motion

  • "Slow pan right" - controlled camera movement
  • "Zoom in" - dynamic depth
  • "Fluid motion" - smooth liquids and smoke
  • "Loopable" - good for textures

6. Context-Aware Generation (I2V)

The most powerful feature of UELTX2 is generating motion from existing project assets. This runs a different pipeline that preserves the color palette and composition of your source image.

  1. Select Asset: a Texture2D in the Content Browser (PNG)
  2. Processing: compress and upload to ComfyUI
  3. Result: an MP4 imported as a MediaSource asset

Resolution Warning

Aspect Ratio Matters: LTX-2 is trained natively on resolutions around 768x512. If you input a square 1024x1024 texture, ComfyUI may crop or stretch it. For best results, ensure your Source Texture has a landscape aspect ratio.
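One way to reason about this is to pre-compute how a source texture would scale into the 768x512 frame. A sketch follows; the multiple-of-32 snapping is an assumption about typical video-DiT latent constraints, not a documented UELTX2 behavior.

```python
def fit_resolution(src_w, src_h, target_w=768, target_h=512, snap=32):
    """Scale (src_w, src_h) to fit inside the target frame, preserving
    aspect ratio and snapping each side down to a multiple of `snap`."""
    scale = min(target_w / src_w, target_h / src_h)
    w = max(snap, int(src_w * scale) // snap * snap)
    h = max(snap, int(src_h * scale) // snap * snap)
    return w, h
```

A 2:1 landscape source (e.g., 1536x1024) fills the frame cleanly, while a square source sacrifices width.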

7. Output, Import, and Persistent Media

Generated videos are imported automatically as FileMediaSource assets (if auto-import is enabled). UELTX2 also copies the output to a persistent location at Content/Movies/UELTX2 so your media source remains valid even if the original ComfyUI output gets deleted.

  • Open Output Folder jumps to the last output on disk.
  • Open Imported Asset highlights the most recent asset in the Content Browser.
  • Output Validation checks the file signature (mp4/webm) before import to prevent corrupt media issues.
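The signature check on the last bullet can be sketched as a magic-bytes test on the file header. The plugin's actual check is C++; matching these magic bytes is the standard approach for both container formats.

```python
def looks_like_video(header: bytes) -> bool:
    """True if the first bytes of a file match mp4 or webm signatures."""
    is_mp4 = len(header) >= 8 and header[4:8] == b"ftyp"  # ISO BMFF 'ftyp' box
    is_webm = header[:4] == b"\x1a\x45\xdf\xa3"           # EBML magic (WebM/MKV)
    return is_mp4 or is_webm
```

Reading the first 12 bytes of the output file before import is enough to reject a truncated or mislabeled download.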

8. Custom Workflows (Advanced)

While Project Settings now handle resolution, steps, and model selection, you can still fully customize the underlying ComfyUI node graph.

The plugin uses JSON templates located in Plugins/UELTX2/Content/Templates/. You can create your own (e.g., adding LoRA nodes, Upscalers, or ControlNets) and link them in the Project Settings under "Workflows".

LTX2_T2V.json
{
  "9": {
    "class_type": "KSampler",
    "inputs": {
      "seed": {{SEED}},           <-- Injected by Plugin
      "steps": {{STEPS}},         <-- Injected by Plugin
      "cfg": {{CFG}},             <-- Injected by Plugin
      "sampler_name": "{{SAMPLER}}"
    }
  },
  "1": {
    "class_type": "CheckpointLoaderSimple",
    "inputs": {
      "ckpt_name": "{{MODEL_NAME}}" 
    }
  }
}

Supported Placeholders: {{WIDTH}}, {{HEIGHT}}, {{FPS}}, {{FRAMES}}, {{SEED}}, {{STEPS}}, {{CFG}}, {{MODEL_NAME}}, {{MODEL_PATH}}, {{MODEL}}, {{PROMPT}}, {{NEGATIVE_PROMPT}}, {{DENOISE}}, {{SOURCE_IMAGE}}, {{CLIP_NAME}}, {{VAE_NAME}}.

You can supply separate templates for T2V and I2V via Custom Workflow Template and Custom I2V Workflow Template. Built-in templates are cached asynchronously for faster submission.
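The injection step amounts to substituting each `{{PLACEHOLDER}}` into the template text before it is parsed as a node graph. A minimal sketch, assuming plain string substitution (the plugin's own mechanism is not specified here):

```python
import json

def inject(template_text: str, values: dict) -> dict:
    """Replace {{KEY}} placeholders, then parse the result as a workflow graph."""
    for key, value in values.items():
        template_text = template_text.replace("{{" + key + "}}", str(value))
    # json.loads fails loudly if a numeric placeholder was left unsubstituted
    return json.loads(template_text)
```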

"Living Texture" Workflow for I2V

Turn static screenshots into dynamic, loopable materials in one click.

**Image-to-Video (I2V)** is the most powerful feature of UELTX2. It takes an existing texture from your project, encodes it into the Latent Space, and uses LTX-2 to hallucinate motion while preserving the original composition and color palette.

The "Zero-Click" Pipeline: Unlike standard video generation, UELTX2 automatically handles the entire Unreal Engine Media Framework setup for you. It creates the Player, sets it to loop, builds the Material, and starts playback instantly.

Automated Actions
  • Auto-Looping
  • Material Generation
  • Shader Linking

Step-by-Step Guide

1. Select Source Asset

Locate a static texture in your Content Browser and single-click to select it.
Recommendation: Use Landscape aspect ratios (e.g., 2:1) for best LTX-2 results.

2. Set Source in Panel

In the UELTX2 Panel, click the "Set Source from Selection" button.

Figure: the panel now shows SOURCE ACTIVE above the prompt field (e.g., "Subtle smoke movement, loopable...").

3. The "Zero-Click" Result

After generation triggers the OnGenerationComplete event, navigate to your output folder (Default: /Game/UELTX2_Generations/).
You will see a complete Asset Family created for you:

  • Source.uasset (FileMediaSource)
  • _Player.uasset (Media Player, set to loop)
  • _Tex.uasset (Media Texture)
  • _Mat.uasset (Material)

Action: simply drag and drop the _Mat asset onto your mesh.

Pro Tips for LTX-2 I2V

The "Loopable" Token

Always include words like loopable, seamless, infinite in your prompt if this texture is for a background element. LTX-2 will try to match the end frame to the start frame.

Color Preservation

Adjust the Denoise (I2V) value in Project Settings (or in your custom workflow JSON).
0.75 allows motion; 0.40 keeps the exact pixels and only warps them slightly (good for "breathing" characters).

Rapid Pre-Visualization Workflow for T2V

Generate "Instant Animatics" to block out cinematic timing before 3D production begins.

The "Greybox" Problem

Level Designers usually place static mannequins or text actors to represent complex events (e.g., "Building Collapses Here", "Explosion"). This makes it impossible to judge timing, sound design sync, lighting intensity, or camera cuts until weeks later.

The UELTX2 Solution

UELTX2 generates a bespoke video of the event and automatically wraps it into a native Level Sequence Asset. You simply drop this asset into your level, and the event plays on the timeline immediately.

Figure: a Level Sequence with a Media Track whose Media Source is bound to UELTX2_Output_Animatic.

The Workflow

1. Describe the Camera & Action

Open the LTX-2 Panel. In Text-to-Video mode (ensure no texture is selected), describe the shot. Use camera terminology.

"Wide shot, low angle, camera shake, a stone bridge collapsing into a river, dust, debris, physics destruction, cinematic lighting, 4k"
2. Generate Animatic

Click Generate.
The plugin will generate the MP4, import it, create a FileMediaSource, create a Level Sequence, add a Media Track, and bind the source to the track automatically.

3. Drag & Drop Integration

Navigate to your output folder. You will find a distinct asset type:

  • _Animatic (Level Sequence)

Final Action:

Drag this _Animatic asset directly into your Level Viewport.
In the Details Panel, verify "Auto Play" is checked. Hit Simulate or Play to watch your AI-generated cutscene play out in 3D space.

"Niagara Source" Workflow for VFX

Generate fluid simulations (Fire, Smoke, Magic) and inject them immediately into Niagara.

UELTX2 now auto-creates a Niagara template system on editor startup: /UELTX2/Templates/NS_LTX2_Template (with an emitter at /UELTX2/Templates/NE_LTX2_TemplateEmitter). When VFX creation is enabled, the plugin duplicates this template and swaps in your generated media material automatically.

The "Simulation Killer"

Normally, creating a photorealistic explosion loop requires external software like EmberGen or Houdini. With UELTX2, you prompt for the effect, and the plugin automatically generates a ready-to-simulate asset family.

// C++ automation logic (simplified)
Material->BlendMode = BLEND_Additive;   // black background reads as transparent
Material->ShadingModel = MSM_Unlit;     // emissive only, no lighting response
LinkNode(Texture, ParticleColor, Multiply); // pseudocode: multiply video RGB by particle color

The Workflow

1. Prompting for Alpha

Use Text-to-Video mode. LTX-2 does not output Alpha channels (transparency), so you must rely on Additive Blending (Black = Transparent). You MUST explicitly instruct the model to create a black background.

Recommended prompt: "Black background, isolated subject, massive fireball explosion, billowing smoke, slow motion, high contrast, 4k"
2. Generate Assets

Click Generate, then navigate to your output folder.
Unlike the other workflows, UELTX2 detects the context and creates a special material suffix: _VFXMat.

Figure: the auto-generated material graph. The Texture Sample (RGB) is multiplied by Particle Color (RGB) and routed into Emissive Color, with luminance driving Opacity.
3. Niagara Assembly

The plugin auto-creates NS_LTX2_Template on startup. When VFX is enabled, it duplicates that template into a _VFX system and assigns your media-driven material. If the template is missing or corrupted, recreate it by restarting the editor.

  • Sprite Renderer: drag _VFXMat into the Material slot. Optionally set Alignment to Velocity Aligned.
  • Particle Update: use Color Over Life to tint the fire; the generated graph multiplies particle color into the texture.
  • SubUV / Animation: do not use. This is a video texture, not a SubUV sheet; it plays automatically via the Media Player.
Figure: the final Niagara result running inside the Viewport.
Important: Media Player Latency

Video textures have a small startup delay, so LTX-2 videos react slightly too slowly for instant effects such as gunshots. This workflow is best for looping elements (torches, burning wreckage, magic portals).

Troubleshooting & Diagnostics

Solve connection refusals, memory spikes, and missing assets.

1. Quick Diagnostics

Error: Connection Refused

> POST http://127.0.0.1:8188/prompt

> LogUELTX2: Error: Connection Refused. Is ComfyUI running?

Fix: Ensure `run_nvidia_gpu.bat` is running, the port matches Project Settings, and the ComfyURL is correct.

Error: Bad Request (400)

> POST http://127.0.0.1:8188/prompt

> LogUELTX2: Error: 400 Bad Request. Node(s) missing.

Fix: Update ComfyUI and ensure required nodes exist. If you are using GGUF models, install ComfyUI-GGUF. Check the ComfyUI console for missing node names.

2. VRAM & Out of Memory (OOM)

LTX-2 is a heavy model. If Unreal Engine crashes or ComfyUI reports CUDA Out of Memory, check your configuration against this gauge.

  • Consumer (8GB - 12GB VRAM, e.g., RTX 3060 / 4070): must use GGUF Q5_K_M quantization; max resolution 768x512.
  • Enthusiast (16GB - 24GB VRAM, e.g., RTX 3090 / 4090): can run bf16 full precision, or high-res GGUF up to 1280x720.

Optimization Checklist

  • Throttle the Editor: in UE5 Editor Preferences, enable "Use Less CPU when in Background" to free resources for ComfyUI.
  • Close High-VRAM assets: Close Metahuman blueprints or massive levels in UE5 before generating.
  • Use GGUF: Installing the ComfyUI-GGUF node is the #1 way to fix memory crashes.

3. The "Red Node" Problem

If the console logs "Success" but no video appears, drop the template JSON into ComfyUI manually to verify integrity.

Symptoms of Missing Nodes
  • ComfyUI loads the workflow but blocks appear Red.
  • The console says ImportError: failed to load...
Solution: Open ComfyUI Manager > "Install Missing Custom Nodes"

4. Unreal Engine Quirks

Toolbar Button Missing
If the LTX-2 icon is missing from the toolbar:
  1. Check Window > Generative AI in the top menu bar.
  2. Ensure Plugins/UELTX2/Resources/ButtonIcon.png exists.
  3. Check "Output Log" for LogUELTX2Editor errors during startup.
Generated Video is Black / Corrupt
This is usually a codec issue.
  • Ensure the ComfyUI SaveVideo node outputs h264-mp4 (default in the built-in templates).
  • Unreal requires Electra Player or WmfMedia plugins to be enabled to play H.264 inside the editor. Check Plugins > Built-in > Media Players.
Remote URL Blocked
UELTX2 only allows localhost by default. Fix: enable Allow Remote URLs in Project Settings if you are connecting to another machine.
No Output / Stuck Generating
Check the Health panel:
  • If WS is disconnected, polling fallback should be On.
  • If the job stalls past JobTimeoutSeconds, it will retry automatically.
  • Use Reinitialize Backend if ComfyUI was restarted.
Imported Asset is an Image
This means the workflow returned a single frame (PNG/JPEG) instead of a video. Verify your ComfyUI workflow includes video output nodes (CreateVideo + SaveVideo) and that the model supports video generation.
Media Source Path Breaks
UELTX2 copies outputs to Content/Movies/UELTX2 to keep FileMediaSource stable. If a media source still breaks, ensure the persistent file exists on disk and that the project has write access to the Content folder.
Panel Won't Open
The UELTX2 panel is a Slate tab. If it doesn't open:
  1. Use Window > Generative AI > Open LTX-2 Panel.
  2. Check Output Log for LogUELTX2Editor errors during startup.
  3. Rebuild the project to regenerate plugin binaries.

Created by Andras Gregori @ Gregorigin, a single dad currently based outside Budapest, HU.

"Building tools that empower creators to shape worlds."