Thanks to the suggestion of Saheb Gulati, I watched the 3rd annual Runway AI Film Festival. It was a short affair—no more than an hour and a half—a collection of short AI-generated films, each stitched together from even shorter 10- to ~20-second clips.1
They showed, on the always-brilliant, comically large IMAX screen, a program of winning entries from the competition. Runway (the AI video company) curated the set. As I understood it, the direction, composition, and editing were human; most visuals and much of the audio were AI-generated, though a few entries blended techniques.2
A few observations:
The best filmmakers made full use of the medium. One—JAILBIRD—was told from the perspective of a chicken rescued from a factory farm and sent to a UK prison as a companion animal.3 Another leaned into diffusion-style morphs between shots to tell a strikingly Nolan-esque story (Christopher Nolan being known for time-bending structure) about space-time travelers in an unnamed suburbia. I never understood why my secondary-school art teachers kept stressing the “use the medium” point—it seemed self-evident—but seeing a new form come into its own, through a curated collection of the best that exists, made it click.
The art styles were beautiful and varied. Imagine the care that went into the distinctive look of each frame of Into the Spider-Verse—now think 80% of that, but for absurdist clay characters, colorful African palettes, and Japanese anime. Many films had little plot coherence, but that felt beside the point. As Adam K noted, it’s a lower-level experience, one to be felt. And some of them were beautiful.
AI art is often dismissed as cheap because it takes less effort. But, with the tech where it is, we are not yet in an era of Hollywood-level slop-on-demand. The filmmakers poured their hearts into these. And you can see the friction. Look too carefully at the backgrounds and the proportions drift. A Japanese man’s facial hair subtly changes and then changes back. The limitations of the form—mainly short clip lengths and inconsistent cross-shot identity—make the films feel rough around the edges.
The Grand Prix winner, “Total Pixel Space,” was a mathematical paean to the space of all possible images and films—clever for this medium—and an excuse to show many a brilliant, absurd shot.
It ended with a simple “see you again next year.” Ominous, perhaps.
1. Most current text-to-video tools generate short clips by default. Runway Gen-3 commonly outputs ~10 seconds with optional extensions up to ~40 seconds; OpenAI’s Sora supports up to ~20 seconds in public interfaces. Scene-to-scene consistency remains a known limitation.
2. AIFF 2025 screened ten finalist shorts; entries ranged from fully AI-generated to mixed-media workflows. IMAX hosted nationwide screenings with Runway.
3. JAILBIRD (AIFF 2025 Gold) presents the world through a chicken’s eyes and places the bird in a UK prison as a companion animal; the director describes it as a semi-fictionalized rescue-to-prison narrative.