Behold Neural Viz, the first great cinematic universe of the AI era.
The filmmaker could not get Tiggy the alien to cooperate. He just needed the glistening brown creature to turn its head. But Tiggy, who was sitting in the passenger’s seat of a cop car, kept disobeying. At first Tiggy rotated his gaze only slightly. Then he looked to the wrong side of the camera. Then his skin turned splotchy, like an overripe fruit.
The filmmaker was not on a movie set, or Mars. He was sitting at his home computer in Los Angeles, using a piece of AI software called FLUX Kontext to generate and regenerate images of the alien, waiting for a workable one to appear. He’d used a different AI tool, Midjourney, to generate the very first image of Tiggy (prompt: “fat blob alien with a tiny mouth and tiny lips”); one called ElevenLabs to create the timbre of Tiggy’s voice (the filmmaker’s voice overlaid with a synthetic one, then pitch-shifted way up); and yet another, Runway, to generate the precise shot he wanted in this scene, which he described in a prompt (“close up on the little alien as they ride in the passenger seat, shallow depth of field”).