
What Is Post-Permission Cinema? (And Why We Are Done Waiting for Green Lights)

A term, a movement, and a manifesto for AI filmmaking by Andrew “Oyl” Miller.

Whether you’ve spent 15 minutes or 15 years in the trenches of advertising, you know the exact weight of a 100-page deck. You know the sound of a client’s legal team slowly killing the best idea in the room, filing down its edges just to be safe. You know the calendar math: concept approved in Q1, cameras rolling in Q3, spot airing by Q4, assuming nothing dies in the room and no corporate restructuring happens between now and then.

Our entire industry was architected around one load-bearing bottleneck: Permission.

Permission to access capital. Permission to hire a crew. Permission from a CMO, a legal team, a regional brand manager, and a nervous account director to turn a script into something that actually lives in the world.

I am here to tell you Permission is Over (If You Want It).

Welcome to Post-Permission Cinema.

Defining Post-Permission Cinema

Post-Permission Cinema is the filmmaking reality we now live in. A moment in creative history where generative AI has completely vaporized the friction between conceptualization and execution.

It is the ability for a single creator, or a small team moving as one, to write, direct, and ship broadcast-quality narrative work without requiring traditional gatekeepers, institutional funding, or the logistical weight of a traditional production.

It’s scary and raw. There are no fallbacks. Accountability is out in the open.

In the Post-Permission Cinema era, the barrier to entry is no longer budget. It is no longer access. It is no longer who you know inside a network, studio, or holding company.

The only barrier left is vision, taste, and the willingness to execute.

I coined this term because the industry needed one. We have been describing pieces of this shift in tags and technical jargon: “AI filmmaking,” “synthetic media,” “generative content.” But we haven’t yet zoomed out and named the larger historical rupture those pieces add up to. Post-Permission Cinema is that rupture. It is a named era, the way we name the French New Wave or the birth of digital non-linear editing. Something permanent has changed. The least we can do is call it what it is.

The Four Tenets of Post-Permission Cinema

Operating in this new reality requires a complete rewiring of how we think about creative production. These are the core tenets:

1. Execution Kills the Pitch Deck

Nobody wants to scan another generic mood board with the same tired references that have been passed down from Tumblr to Pinterest to TikTok, and smuggled away in art directors’ messy desktops. When the tools exist to generate high-fidelity cinematic renders in minutes, when you can go from a script idea to a locked cut in a weekend, writing a deck about what a film might look like is a waste of everyone’s time, including yours.

In Post-Permission Cinema, the pitch is the pilot. You no longer ask a client or an executive to imagine the final product. You drop the final product in an email. The new starting point is the rough cut. Pixar style. Let’s judge the details right after we have the script. We are flattening time and exposing taste, or lack thereof.

I’ve spent years seeing teams put together the best pitch decks and treatments in the business. Beautifully typeset. Full reference imagery. High taste references and GIFs. Careful strategic framing. I’m proud of that craft. And I am telling you, as someone who lived inside that system, that those decks were always a substitute for the thing we actually wanted to make. Now we can just make the thing.

Execution is the only currency that matters now. No more hiding behind borrowed references.

2. Taste Is Your Only Moat

When anyone can prompt a photorealistic scene using Luma Dream Machine, Veo, Runway, or Midjourney, and when the render button is universally accessible, the technology itself stops being a competitive advantage. The question of craft becomes, what will you put into these dream machines? Sloppy input leads to sloppy output. Just look at what our feeds have become.

What separates a filmmaker from someone mashing “generate” is everything that comes before the prompt and everything that happens after the output: the conceptual architecture, the instinct for what to cut, the understanding of pacing and tone, and why a certain piece of music makes a sequence land differently. And, most importantly, knowing when what you’ve generated doesn’t hit the mark.

This is where years of agency rigor and rendering harsh judgment in the name of raising the creative bar pay dividends in ways no one expected. I’ve spent two decades learning how to build narrative tension inside a 60-second broadcast window. Learning how lighting, camera movement, set design, wardrobe, casting, and all the individual disciplines add up to elevate a piece of work. Learning how a single editorial choice can be the difference between a film people share and a film people scroll past.

None of that experience became obsolete when the tools arrived. It became the whole game. Your eye plus your judgment will define you.

3. The Pipeline Is Flat

Traditional production is a strictly linear march: Pre-Production → Production → Post-Production. Each phase is a separate kingdom with its own vendors, timelines, and handoffs. The whole apparatus was designed to manage the complexity of physical production, which required sequences and dependencies almost by necessity.

Post-Permission Cinema compresses all of that into a single, overlapping workflow. Scripting, storyboarding, scoring, and cutting happen simultaneously and recursively. I’ll be editing a sequence, and the edit will suggest a narrative direction I hadn’t considered, which sends me back to the script, which changes the shot list before I’ve even “generated” the shots.

The director is no longer managing a logistical army. They are conducting a symphony of algorithms, agents, and enhanced capabilities, and more importantly, they are making artistic decisions at every stage that used to get diluted across a dozen departments and handoffs.

4. The Audience Is the Only Approver

We used to let ideas die quietly in conference rooms. A concept would get as far as a deck, hit a budget wall or an anxious client, and simply disappear, un-made and un-seen, as if it never existed.

Today you build the work. Then you push it directly into the cultural slipstream and let the timeline decide. Creators are already operating this way. Brands and studios will be next. If they’re smart.

The internet is a ruthless and honest editor. If the work is undeniable, it moves. If it doesn’t move, the problem is yours to solve, not a committee’s, not a budget holder’s. That accountability is clarifying in a way that the permission economy never was.

This Isn’t Theory. It’s Already on the Air.

I want to be precise about this because the discourse around AI filmmaking still tends to exist at the level of demos and proofs-of-concept, tech presentations, and Discord threads. What I’m describing has already crossed into the broadcast world.

In early 2026, I directed one of the first fully AI-generated commercials to clear network standards and air nationally. A 30-second spot for Proofpoint, produced with ONLYCH1LD, and broadcast on ESPN. I applied these lessons immediately to creating two spots, Luma Taxi and Luma Suds, for the Luma Dream Brief that were selected by an elite creative industry jury, run as real ads, and officially submitted to Cannes Lions. Once again, I am taking these lessons and momentum and putting them right back into practice. No bending the knee. No asking nicely. More firsts are coming.

The tools are ready. The networks are accepting the work. The festivals are beginning to evaluate it on the same stage as everything else. We are in full transition now. There are sea changes happening under the surface that will start manifesting in radically diverse ways.

The era of Post-Permission Cinema is not approaching. It has arrived and now surrounds us.

Why Naming This Matters

Every wave of innovation is first dominated by technical conversations. It took storytelling breakthroughs like Jurassic Park and Pixar’s string of early hits to transform technical achievements into a larger, more human, and compelling conversation.

Right now, AI filmmaking is dominated by tech bros flexing workflows and render hacks. Where are the writers? Where are the storytellers? Where are those trying to go beyond the multitude of Elon Musk remixes and cat memes?

I jumped into these tools early, and I want to start naming what’s going on. I want to start having a more nuanced discussion and move beyond all the triggered trolls in my DMs, copy-pasting the same stale arguments. When you stand up first in an area, you can’t help but provoke debate. But would you rather have the machines go fully autonomous, or have some humans in there, wrestling with the tools and fighting to push beyond the edges? What if creatives took over the tools and the conversation from the platform companies, venture-funded labs, and the trade press still trying to figure out how to cover it? Creators are getting drowned out and marginalized now. We have to punch back.

Post-Permission Cinema is the term for what we are all living through. It describes the power shift. It describes the pipeline change. And it describes the creative obligation that comes with it: there are no more excuses.

The interesting thing is how it is triggering the top and bottom of the pecking order. The old guard and traditional gatekeepers are shaking, as are the anonymous YouTube commenters and trolls. I’m swimming somewhere in the messy middle.

I’m for the independent artists. For poets and writers and makers trying to make themselves heard. I believe it’s a privilege not to use every tool at your disposal. Does Steven Spielberg have any use for emerging tools? Probably not, unless they serve his vision. However, a filmmaker with 300 subscribers on YouTube is in a different boat. The world is not paying attention yet. And I believe the hungry artists with something to say will do whatever it takes to give their ideas shape.

These tools are for anyone who has ever felt blocked.

For anyone who has been denied access. For those who have not been able to raise funds to make their ideas come to life.

Why spend any more of your career idling at red lights that never turn?

The roads are open now.

Where we’re going, you don’t even need roads. You don’t need a green light. You don’t need a studio. You don’t need a network pickup or an agency brief or a client budget approved in Q1.

You need a vision, a set of tools that are already in your hands, and the discipline to stop waiting for someone to tell you it’s okay.

That permission was never going to come anyway. So stop asking and just make like you’ve never made before.

And as we’re seeing early on, traditional gatekeepers are paying attention and trying to figure out what’s going on. Conversations are happening, and models are shifting.

You can’t control that. But you can control what you make.

So what are you waiting for?

Andrew “Oyl” Miller is an advertising Creative Director, Copywriter, and AI Film Director. He spent 15 years working at Wieden+Kennedy on brands like Nike, PlayStation, MLB, Amazon, and IKEA—and is now one of the first people to direct a fully AI-generated commercial for broadcast television. You can follow his insights and updates on his newsletter.

From Laptop to Cannes: Making "Luma Taxi" and "Luma Suds"

How a hybrid generative-AI workflow and the Luma Dream Brief turned two solo, cinematic experiments into official Cannes Lions entries.

I made two commercials completely by myself.

And now, thanks to an opportunity from the Luma Dream Brief, they are both headed to Cannes Lions as official entries.

The first spot, Luma Taxi, was born from an idea I’ve had in my head for ages: drop futuristic tech into the Old West, and play it entirely straight. No winking at the camera, no lengthy explanations. Just cowboys and cowgirls operating as early adopters to a new technology and never looking back.

The second spot, Luma Suds, pulls from a deeply personal canon of world-building. Having spent over a decade living in Tokyo, I’ve constantly looked for ways to subvert the gritty, cinematic Japanese crime drama. I wanted to create a laundry detergent commercial with literal life-and-death stakes, living in a dark, criminal underworld where yakuza family members keep lying about the nature of the red stains on their clothes. It’s always that damn “beet root.”

Both of these films were made by me, sitting in front of my laptop, orchestrating every aspect of the generative-AI production.

The era of the one-person studio is a reality now.

Here is a look under the hood at how they came together.

The Blueprint and the Build

Everything started with a script. From there, I wrote a brief outline of the world-building and tone, and uploaded those documents to Luma’s agents.

Then came the visuals. I started by nailing the look of the characters as stills before moving on to the settings. It took a lot of trial and error to achieve the gritty, lived-in, photorealistic worlds I was imagining, but Luma did a pretty incredible job of matching the aesthetic in my head.

As the visuals developed, I moved into Suno to start working on the music. Because each film plays in a very distinct genre, I had clear guardrails for the sound I wanted. I generated around 15 to 20 tracks, picked my favorites, and dropped them into the timeline on my free trial version of Final Cut Pro.

Once I had a music bed and my early generated footage, I started cobbling together a rough cut.

The “Pixar Method” of Gen-AI Directing

My AI directing process is heavily modeled after the Pixar method. In Ed Catmull’s book Creativity, Inc., he lays out a rigorous model for iterating and dialing in animated features. You start with very rough sketches cut together into sequences. Over time, the animation is developed and rendered in higher fidelity, replacing the original sketches. The point is that, from a very early stage, you can start to feel the pulse of your characters and the shape of your film.

I worked the exact same way with these edits.

I watched the rough assemblies over and over, diving into the problem areas, tackling the things that bothered me the most, and making those my priorities. This is where taste comes in, something that only comes from experience lived and absorbed outside of the prompting box.

Honestly, there were moments I desperately wanted to include that the tools just couldn’t pull off. I wanted snappier back-and-forths between the characters, but AI “actors” aren’t quite there yet. So, I pivoted. I shifted my approach and leaned heavily into visual storytelling.

Iterating in Real-Time

Over the course of a week, I just kept staring at my rough cuts. Once the basic blocks were in place, I started finessing the transitions. Was the music connecting us? Were the cuts satisfying? How could I make it more surprising? These are the exact same questions I’ve been asking for the past 20 years while making commercials the traditional way. So many of those instincts and skills carry over directly into AI; the timeline is just massively compressed. It almost feels like you are rewriting the script over and over again until the film is done.

And in AI, there are no “re-shoots.” If you realize you are missing a shot in the edit, you can generate it and plug it in within minutes.

As the cuts got sharper, I zeroed in on the details. How could I land the end card and the product reveal? Were there facial expressions I wanted to try again? Did I need supporting sound effects? Was the footage getting repetitive? It was time to fine-tune and kill my darlings.

Solving for Story

With Luma Taxi, the spot was entirely driven by the narration. I wanted it to feel like an Old West fable, guided by a gravelly, unreliable narrator telling us exactly how things went down. The voice was designed using ElevenLabs. After about seven variations, I found the exact tone and texture I needed, a voice I definitely want to use again in future projects.

I fine-tuned that voice to match the rhythm of the visuals and the music until it felt seamless. If there was a gap that felt too quiet, I’d write a little more VO to connect the thoughts. My original VO ran long (as they usually do), so I just continued to watch the cut and perfect the narration all the way through production.

With Luma Suds, the trickiest part was executing the opening “problem” trigger: the young yakuza spilling wine on the godfather. I spent dozens of generations trying to crack that scene “in-camera,” but the physics and blocking never worked out. Even when I described exactly what I wanted, the AI actor would do something completely out of pocket, like grabbing the godfather’s arm and violently shaking it to spill the wine, killing the tension and drama of that moment.

Instead of fighting the physics, I leaned into the reaction shots of the partygoers to tell that part of the story. It actually ended up playing up the gravity and consequence of the moment far better than a direct spill would have.

The Era of Post-Permission Cinema

Looking back at these two spots, the biggest takeaway for me is how much of traditional filmmaking still applies. The tools have changed, but the fundamental need for human instinct hasn’t. For two decades, I’ve relied on taste, timing, and problem-solving to make ideas work. Now, I’m applying those exact same muscles to a generative workflow.

We are stepping into an era of post-permission cinema. You no longer need a massive crew, a sprawling location shoot, or a bloated budget to bring a wild, cinematic idea to life and get it all the way to Cannes. If I had my choice and an unlimited budget, I’d always choose the traditional way. But I am aware that change is coming, and new lanes and forms will emerge from these tools. My approach is to get in there early, deeply learn and test the tools, and help push the boundaries of what is possible.

Most of the early AI work I’ve seen has not been for me. At first, I thought it was just for quick memes and fan fiction featuring Elon Musk. Getting deeper into the tools, I now see that you can make whatever you want. The tech bros will continue making their “Hollywood is cooked” spectacles, but when true artists get behind the keyboard, different tones and voices will be unlocked.

It’s an exciting time for makers. Where your ideas can now go directly into production. No client approvals. No messy group meetings watering things down. No uncomfortable compromises. If you have an idea, you can bring it to life in a very final and polished way. It has collapsed time and stacked disciplines in a way we’ve never seen before.

You just need a laptop, a clear vision, and the willingness to iterate until the story clicks.

The barriers are gone. Now, the only thing you need is a great idea.



Two Commercials I Directed Are Heading to Cannes

My Luma Dream Brief entries “Luma Taxi” and “Luma Suds” are moving forward as real ads and official Cannes contenders.

I got a fun email this morning.

Two commercials I wrote and directed for the Luma Dream Brief have been selected to run as paid ads, and will officially be entered into the 2026 Cannes Lions. From here, any of the selected ads that win a coveted Gold Lion will split a share of a $1,000,000 prize pot.

The past year has been a grind and a blur in the AI film space. What started as a curiosity quickly gathered momentum. 2026 has seen one of my AI commercials run on ESPN, and now two more are official ads for Luma AI, heading to Cannes.

Anyway, here are my spots for Luma, now running as paid ads.

The first, Luma Taxi, takes us to the town of Luma, where the horses have gone on strike and autonomous vehicles help the Old West mayhem go off without a hitch.

And Luma Suds, the generational yakuza epic meets laundry detergent commercial of my wildest dreams.

Under the Hood: The Hybrid Workflow

For both commercials, I used a hybrid workflow I’ve been developing over the past year. AI filmmaking leverages powerful technology, but the best examples I’ve seen are far from the “just push a button” meme you see in your feeds. There is still a lot of room for human authorship and decisions to be made. Here is how these films were built:

  • Writing as Storyboarding: It starts with rough notes and outlining, which then turn into a draft of a script. I usually start storyboarding at this phase, but I like to do it in writing. I’ll write out dense descriptions of the scenes and moments I have in my head, which naturally evolve into the core prompts.

  • Generating the Blocks: Next, I start generating the key sequences. The prompting and results can still be like rolling a pair of dice, so a lot of re-rolling is involved.

  • The Edit: Once I have the basic building blocks, I drop the clips into Final Cut Pro. From there, I start pulling selects and getting the spine of the story into a rough edit. It’s a lot of back-and-forth with the Luma model to fill in the holes and perfect key moments.

  • Setting the Tone: Early on, I go into Suno AI (a generative music model) to start playing around with the tone of the score. I like to get music on the timeline to edit against. Even if it’s rough, I can always switch it out later with a more crafted version.

  • Voice & Character: For the Luma Taxi spot, I created an original AI narrator in ElevenLabs. It had to be gravelly and period-accurate, a voice with enough authority to make even the absurd sound like a black-and-white legend. I was pleased with how it fit the genre and tone I was going for.

I sat with both edits for a couple of weeks, watching them through and making little tweaks to the timing as I lived with them.

With generative AI, there are always moments you wish turned out differently, or frames you wish matched the exact, specific vision in your head. But like any traditional commercial set with a hard deadline, at some point, you have to let go and put it out there.

I’m excited to have the spots out there and running as real ads. It’s all so surreal, and I already have a number of upcoming projects in the pipeline. Stay tuned.


How I Became an AI Film Director After 20 Years in Advertising

From Nike campaigns to uncanny valley experiments to an AI commercial on ESPN—how Veo, Luma, and pure obsession unlocked a new way to make films without permission.

Still from Luma Taxi Dream Brief AI commercial. Written, directed, and produced by Andrew “Oyl” Miller. 2026.

For a long time, I stayed away from generative AI video tools.

What I saw in my feeds looked like memes and party tricks.

I’ve spent over two decades in advertising, writing and producing commercials and content for brands like Nike, PlayStation, IKEA, Amazon, and more. I had my routines, my established network, and my way of doing things. I’ve collaborated with and had deep creative discussions with legendary directors like Tony Kaye and Frank Budgen. I’ve been lucky enough to glimpse into a dream world of film, and I fell in love with the craft of it all. Yes, tools and taste are ever-evolving, but I fell deep into the belief that traditional film techniques were the only way.

But then, last year, Google dropped a generative AI video model called Veo 3. I soon saw a random clip of an AI-generated character speaking, with near-perfect lip-sync.

That was the Big Bang of my rocketing journey through the AI film universe.

Suddenly, a lifetime of imaginary characters and dormant stories flashed through my head. Old directionless fragments and shards of ideas in dusty notebooks suddenly had new life. There was so much inside of me that had never found a proper outlet or received the official industry blessing. But when the sky cracked open, and I saw that a cluster of pixels could approximate life and human-ish performance, the writer in me started shaking. I didn’t know exactly where this was going, but I was suddenly, violently compelled to get these ideas out.

Welcome to the Infinite Sandbox in the Uncanny Valley

I started with the low-hanging fruit: Stormtroopers. As a lifelong Star Wars nerd, I found that placing Stormtroopers into our everyday world was a cheap engine for endless gags. They went camping, they went to Cannes, they went to Burning Man. The possibilities were endless. It became a meme. Others jumped on.

But soon, I knew I needed to get my own voice out there. I saw AI not just as a way to create blockbuster spectacle, but as a potential platform for unique writing and voice. So naturally, I dipped into the 1980s.

I started thinking about archaic, crusty baseball coaches who hated the modern game. Men triggered by everything, armed with zero self-awareness and iron-clad beliefs from an ancient era. Being in advertising, I knew a funny character wasn’t enough. I needed a platform. I needed world-building.

That became Deadball Academy. Set in present-day Scottsdale, it’s a facility run by a group of coaches stuck in 1984 who bring in modern baseball prospects and corrupt them with deeply backward instruction. It’s a whole universe with lore and bizarre pockets of backstory. I quickly realized there was a LOT to mine here.

The episodes started writing themselves. Sometimes by hand, sometimes as fragmented dialogue and jokes in a notes app. When I strung together enough lines that made me laugh, I started building prompts. It became a new form of mini-screenwriting: establishing a setting, defining a character description, placing a line of dialogue, dictating the delivery, and always defining the cinematic camera look and movement.

Prompt. Prompt. Prompt.

Judge. Re-write. Edit. Curate.

The characters and voices came flooding back. Some were stuck in the uncanny valley; others looked insanely, undeniably good. Nothing was perfect, but it was allowing me to build a rip-o-matic for a cinematic universe that simply didn’t exist before.

Building in Public (and Becoming the Villain)

Whenever an idea outside of Deadball Academy popped into my head, I pursued it. I leaned into Midjourney to test visual styles. I used Suno to tap into my love of songwriting, generating rough, pounding tracks to score my films. Quickly, I was building up a workflow and stack of tools that let me operate a film studio right at my desk.

All the while, I was building in public.

And the internet reacted exactly how you’d expect. I started getting nasty DMs and anonymous trolls flooding my channels. I get it. AI is polarizing, and like it or not, I’ve become the bad guy to some people. But my curiosity, and the voices demanding to be let out of my head, wouldn’t let me stop. Sorry, not sorry. You don’t last in advertising without developing a bulletproof coping mechanism for intense criticism. I just kept pushing. I hear the voices, and the silent judgment, and I keep going.

I’m not looking for your approval. I’m looking for possibilities.

The Trojan Horse and the Million-Dollar Brief

Then, the inbound interest started.

One of those calls turned into writing and directing my first AI commercial, for cybersecurity start-up Proofpoint, which actually aired on ESPN. That is still an insane sentence to write, but it’s internet fact now. I’ve got the receipts. I partnered with the visionary team at ONLYCH1LD, and their openness to this new form was infectious. I even made a bonkers BTS gag reel using the “lead actor” from the Proofpoint spot, CLIFF DATAMAN. Yet another exercise in using the tools for world-building. I just keep leaning into the tangents I find most interesting.

Right around that time, an old Wieden+Kennedy colleague reached out about the Luma Dream Brief.

The AI film contest from Luma AI asked AI directors to use the Luma model to make an ad for a fictional Luma-branded product. Entries would go before a panel of advertising, film, and creative industry legends. Their picks would then be run as real ads for Luma, and officially submitted to Cannes Lions.

I’m not an awards hound, but I recognize how awards contribute to career momentum. The thought of creating breakthrough work in an emerging film category was a strong motivation. On top of that, the contest offered a share of a one-million-dollar prize pot to any commercial that ends up winning a Gold Lion.

As someone who has had some of my best work denied the blessing to be submitted to Cannes and other festivals, for weird internal political reasons, I found the idea of no gatekeepers and a chance at entry appealing. Gatekeepers in advertising can be brutal. This contest arrived at the exact right time, offering a clean path to submit something with my uncompromised vision directly to Cannes, complete with a shot at a million dollars.

I dove into Luma’s tools and quickly built up a series of spots. What Luma did was validate my deepest belief: the best idea can come from anywhere. Committees, meetings, and endless feedback loops obfuscate that truth. Luma provided a cheat code to circumvent the murky layers of the industry. No feedback. No hidden agendas. No rubber stamps.

Just a clean shot.

If someone wanted to pair a multi-generational yakuza epic with a hard sell for laundry detergent, no one could stand in the way.

What’s Next?

This is where I stand in 2026. Turning a new page, letting my curiosity drive the way.

I will keep pushing, refining, and mastering these tools. But more importantly, I am looking to push beyond advertising. I’m looking to formalize series and put my voice out there in bigger, longer, more ambitious ways.

I have drafts of screenplays and novels waiting in the wings. I now see a world where AI filmmaking bridges the gap between a written page and a green light that I’ve been chasing for years at the end of a long and winding tunnel. Proof of concepts. Opening scenes. Theatrical trailers. That is the new brief.

My mission statement is this: I will keep making things that no one is asking for.

How can I use AI not just to increase my output or be more efficient, but to truly amplify my voice and get my stories made? It’s a crazy dream. It’s a lonely road. But the curiosity and possibilities keep me building. Studio Oyl.

What that means is I’m just a guy at a laptop, letting my fingers do the dreaming.


My First AI-Directed Commercial Just Aired on ESPN

From prompt to national broadcast: Teaming up with ONLYCH1LD to bring Proofpoint AI to life.

Here’s a fun one.

I just directed my first AI commercial to air on broadcast television. No set. No crew. No craft services. Just a brief, a core of smart creatives, and the tools to make it real.

This comes on the heels of CDing a very beautiful, traditionally crafted spot for MLB Japan, shot on location in Tokyo with the Fridman Sisters and Stink. If I had my choice and an unlimited budget, I would make films the old-school way every time.

Welcome to 2026. With one foot planted in the traditions of film craft, and the other under the desk in my AI-powered portable studio, I bring what I’ve learned from nearly 20 years of experience in advertising to this new frontier.

My early AI experiments, like my original AI sports comedy series, Deadball Academy, have brought me a string of interesting meetings, including one with the San Francisco-based production company ONLYCH1LD.

It was from these talks that the opportunity to direct my first broadcast AI commercial popped up. The timeline was aggressive and the plan ambitious. But with ONLYCH1LD as steady, experienced, and enthusiastic partners, we took the plunge and immediately started production on a spot for Proofpoint, a leading cybersecurity company in Silicon Valley.

The vision for the spot was chaotic, comedic, and a little bit unhinged. An AI CEO walks calmly through an office under siege while Proofpoint’s agents extinguish fires, stop robbers, and prevent data theft, all with the energy of a Saturday morning cartoon directed by someone who grew up on Tony Scott.

Working with the team at ONLYCH1LD, we pulled the creative together in a matter of weeks, a timeline that would have been impossible in traditional production. AI didn't replace the creative process. It compressed it. The brief still needed a point of view. The script still needed a voice. And putting it together required a lot of human conversation.

After spending a couple of weeks “directing” the AI-actor, I had grown a little attached to that cluster of pixels. I even gave him a name: CLIFF DATAMAN. One thing led to another, and soon I had a full-on behind-the-scenes blooper reel, profiling Cliff Dataman’s “on-set” antics.

As I said, things got a little unhinged.

Via Ads of the World — Part of the Clio Network, March 2026:

ONLYCH1LD brings cybersecurity to life in its latest campaign for Proofpoint, turning an office under attack into a chaotic, comedic AI-powered spectacle. The fully AI-driven spot blends absurd humor with bold visual storytelling, showcasing how Proofpoint protects people, data and brands against cyberattacks.

“ONLYCH1LD has been a trusted creative partner for us, and they quickly understood the need to deliver something memorable that balanced humor with clear messaging,” shares Proofpoint CMO Joyce Kim. “Their team proposed a fast, intentionally over the top AI approach that allowed us to move quickly while still creating a bold piece that stands out.”

The campaign came together in just a few weeks following a marketing pivot by Proofpoint’s new CMO, who turned to ONLYCH1LD to reimagine messaging and tone. The :30 broadcast spot, currently airing on ESPN, features an AI CEO casually walking through a chaotic office while Proofpoint’s AI agents prevent data theft, extinguish fires and stop robbers in their tracks. ONLYCH1LD also produced a blooper reel and short clips for social media, giving audiences a look behind the scenes and amplifying the campaign’s absurd, cinematic humor.

“It’s funny, because I wasn’t really ‘on set,’ given a computer made this commercial. But had I been, I would’ve been thrilled with the commitment!” concludes ONLYCH1LD’s ECD Samuel Miller. “In reality, there was no reason to play it safe given the timeline and desire for memorability. We decided to go a bit over the top — or, as Oyl said, ‘bombastic.’ Oyl gave us the freedom to do that while still staying grounded in Proofpoint’s message. It ended up being this fun, controlled chaos — while still fully on brand. And kind of weirdly authentic in its humor.”
