Tech Souls, Connected.

Luma’s New AI Agents Aim to Run Entire Creative Workflows

The startup’s new agent platform aims to automate entire creative workflows—from concept to video production—across text, images, audio, and video.


Luma Introduces End-to-End Creative AI Agents

Luma AI, known for its video-generation technology, has unveiled Luma Agents, a new platform designed to handle complete creative workflows across multiple media formats.

Powered by the company’s new Unified Intelligence models, the agents can plan, generate, and refine creative content spanning:

  • Text
  • Images
  • Video
  • Audio

The system targets ad agencies, marketing teams, design studios, and enterprise brands, positioning itself as an AI collaborator capable of managing complex creative production from start to finish.


A Single Multimodal Intelligence System

At the core of the platform is Uni-1, the first model in Luma’s Unified Intelligence family.

Unlike traditional AI models trained separately for different tasks, Uni-1 uses a single multimodal reasoning architecture.

The system is trained across:

  • Language
  • Images
  • Video
  • Audio
  • Spatial reasoning

According to Luma CEO Amit Jain, the model can interpret instructions in language while generating visual outputs.

“It can think in language and imagine and render in pixels… we call it ‘intelligence in pixels.’”

Future versions will expand capabilities further into audio and video generation.


Coordinating Multiple AI Models

Luma Agents are not limited to the company’s own models.

The platform can coordinate workflows across several AI systems, including:

  • Luma Ray 3.14
  • Google Veo 3 and Nano Banana Pro
  • ByteDance Seedream
  • ElevenLabs voice models

This orchestration approach allows the agent to select the best tools for each stage of a creative project.

Instead of juggling dozens of separate tools, users interact with a single agent that manages the entire pipeline.
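The orchestration idea described above can be sketched as a simple routing table: each stage of a creative plan is mapped to the model best suited for it, and the agent walks the plan stage by stage. The stage names, model names, and routing table below are illustrative assumptions, not Luma's actual implementation.

```python
# Minimal sketch of multi-model orchestration: a registry maps each
# workflow stage to a model, and the agent routes work accordingly.
# All names here are hypothetical placeholders.

MODEL_REGISTRY = {
    "script": "text-model",
    "storyboard": "image-model",
    "video": "video-model",
    "voiceover": "audio-model",
}

def run_pipeline(stages):
    """Route each creative stage to its registered model and collect results."""
    results = []
    for stage in stages:
        model = MODEL_REGISTRY.get(stage)
        if model is None:
            raise ValueError(f"no model registered for stage: {stage}")
        # A real system would call the model's API here; this sketch
        # only records which model handled which stage.
        results.append((stage, model))
    return results

plan = ["script", "storyboard", "video", "voiceover"]
print(run_pipeline(plan))
```

The point of the pattern is that the user hands the agent a plan, not a series of prompts to individual tools.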


Persistent Context for Creative Work

One key feature is the ability to maintain persistent context across projects.

Creative teams often work through dozens of iterations involving different assets, collaborators, and design directions.

Luma Agents track:

  • Project assets
  • Creative revisions
  • Feedback loops
  • Collaborative inputs

The system can then evaluate its own outputs and refine them automatically, creating a continuous improvement cycle.
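A persistent context of this kind can be pictured as a small accumulating record of assets, revisions, and feedback that survives across iterations. The field and method names below are illustrative, not part of Luma's product.

```python
# Sketch of a persistent project context: assets, revisions, and
# feedback accumulate across iterations so the agent can reason over
# the project's history. Field names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ProjectContext:
    assets: list = field(default_factory=list)
    revisions: list = field(default_factory=list)
    feedback: list = field(default_factory=list)

    def add_revision(self, asset, note):
        """Record a new revision of an asset along with reviewer feedback."""
        self.revisions.append(asset)
        self.feedback.append(note)

ctx = ProjectContext(assets=["hero_image_v1"])
ctx.add_revision("hero_image_v2", "brighter background, same logo placement")
print(len(ctx.revisions), len(ctx.feedback))
```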

Jain compares this capability to the self-review loops used in AI coding agents, which check and improve their own results.
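The self-review loop Jain describes follows a familiar generate-evaluate-refine pattern. The sketch below uses a toy numeric score; a real agent would use model-based evaluation, so the scoring function and threshold are stand-ins.

```python
# Illustrative generate-evaluate-refine loop of the kind used by AI
# coding agents: produce a draft, score it, and refine until it passes
# a quality bar or the round limit is hit. All details are placeholders.

def refine_until_good(generate, evaluate, refine, threshold=0.9, max_rounds=5):
    """Generate a draft, score it, and refine until it passes or rounds run out."""
    draft = generate()
    for _ in range(max_rounds):
        score = evaluate(draft)
        if score >= threshold:
            break
        draft = refine(draft, score)
    return draft

# Toy usage: 'quality' improves by 0.2 with each refinement pass.
final = refine_until_good(
    generate=lambda: {"quality": 0.5},
    evaluate=lambda d: d["quality"],
    refine=lambda d, s: {"quality": d["quality"] + 0.2},
)
print(final["quality"])  # at or above the 0.9 threshold
```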


Early Enterprise Adoption

Luma has already begun deploying the platform with major clients.

Early adopters include:

  • Publicis Groupe
  • Serviceplan
  • Brands such as Adidas and Mazda
  • Saudi AI company Humain

The company says the agents can significantly accelerate large-scale creative production.

In one example shared by Jain, a 200-word campaign brief and a single product image were enough to generate multiple ad concepts automatically.

In another case, a $15 million global ad campaign was localized into multiple regional versions; the process took 40 hours and cost under $20,000.

According to Luma, the generated materials still passed internal brand quality checks.


Moving Beyond Prompt Engineering

Jain argues that current AI workflows are too fragmented.

Creative professionals often need to prompt multiple tools repeatedly, making the process inefficient.

“Here are 100 models. Learn how to prompt them,” he said, describing today’s typical AI workflow.

Luma’s goal is to remove that complexity.

Instead of prompting tools individually, users interact with an agent that generates multiple creative variations automatically and allows teams to steer the direction conversationally.


A Gradual Rollout

Luma Agents are now available through API access, though the company plans a gradual rollout.

The staged launch is intended to ensure system reliability and consistent performance as enterprise teams integrate the technology into production workflows.

If successful, the platform could signal a shift toward AI agents that manage entire creative pipelines, not just individual media generation tasks.


TL;DR
Luma has launched Luma Agents, an AI platform powered by its Unified Intelligence models that can handle full creative workflows across text, images, video, and audio. The system orchestrates multiple AI models and aims to automate marketing and design production for enterprises.
