Two vehicles chase each other across the countryside in a high-speed race. One is driven by an experienced race car driver, the other controlled by a sassy artificial intelligence out to win her first competition. That’s the gist of “The Human Race,” a new branded short produced by The Mill for Chevrolet that prominently features the carmaker’s new 2017 Chevrolet Camaro ZL1 racing a futuristic Chevy concept car.

But in this case, AI and self-driving cars are just the icing on the cake for tech enthusiasts. “The Human Race” didn’t premiere at an auto show Wednesday, but at the Game Developers Conference (GDC) in San Francisco. That’s because Chevy’s latest car was entirely rendered in real time with the help of the Unreal game engine, foreshadowing how moviemaking could fundamentally change in years to come, thanks to real-time visual effects.

Unreal maker Epic Games previewed the movie for a small group of journalists Tuesday. After first playing the short film in its entirety, Epic CTO Kim Libreri handed Chevrolet marketing director Sam Russell a phone to change some aspects of the movie on the fly while it was running in the background.

Chevrolet marketing director Sam Russell used this configuration screen to change the look of the car in real time. Courtesy of Epic Games

Russell pressed a button, and the color of the car in the movie switched from red to blue in real time, then to yellow, then to silver. Then he switched the entire make of the car so it was shaped like a 1950s Chevy, all while the movie was still running on the big screen, the race continuing with a completely different car. “That was not a regular movie. That was a real-time generated movie,” quipped Libreri afterwards.
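The phone demo boils down to a simple property of real-time pipelines, sketched below in Python as a hypothetical illustration (not the actual Unreal or Cyclops code): because every frame is generated from the scene’s current state, editing that state changes the very next frame, with no re-render of the finished sequence required.

```python
# Conceptual sketch only. In an offline VFX pipeline, changing the car's
# color would mean re-rendering the affected frames; in a real-time loop,
# the engine reads the current scene state on every frame, so the change
# appears immediately while playback continues.
scene = {"model": "Camaro", "color": "red"}
frames = []

def render_frame(state):
    # Stand-in for an engine's per-frame draw call.
    return f"{state['model']} in {state['color']}"

for frame_no in range(10):
    if frame_no == 5:           # the viewer taps the color button mid-"movie"
        scene["color"] = "blue"
    frames.append(render_frame(scene))

# frames[:5] show the red car, frames[5:] the blue one.
```

The same principle covers Russell’s model swap: replacing the mesh the renderer reads simply changes what the next frame draws.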

A big part of what makes this magic work is Epic’s Unreal game engine, which computer game developers have been using for years to render their visual assets in real time, and which Epic now wants to bring to Hollywood.

In the case of “The Human Race,” Unreal was combined with a virtual production toolkit dubbed Cyclops that The Mill developed in-house. Also key to the production was the Blackbird, a special stand-in car that The Mill built a few years ago for commercial shoots where it didn’t have access to the real thing.

The Blackbird, a stand-in car developed by The Mill. Courtesy of Epic Games

The Blackbird, which was featured on stage in San Francisco, looked a bit like a Mad Max vehicle plastered all over with QR codes, and carried a number of cameras and other sensors to record information from a drive that can later be applied to a computer-generated rendering.

The Mill has used the Blackbird for other car commercials in the past, applying traditional visual effects after the fact to replace the stand-in with whatever car a company like Chevrolet wanted advertised but not publicly seen on the streets during a shoot. The car itself is pretty sophisticated, complete with an electric engine that can be programmed to emulate the torque of any of the cars it is supposed to represent.

“The Human Race” was the first time the Blackbird was used for real-time rendering, but The Mill’s global director of emerging technology Boo Wong said that the company intends to use this technology for future projects as well. “We would like to do this as soon as possible on as many projects as possible,” she told Variety Tuesday.

A scene shot with the Blackbird… Courtesy of Epic Games
… and the same scene with the Chevrolet car rendered in real time. Courtesy of Epic Games

Epic is set to release a version of Unreal that will support this type of real-time movie rendering later this year. Wong cautioned that it may take filmmakers a while to adapt to this new approach to visual effects, in part because real-time rendering allows for much less cleanup after the fact. “Our artists have to be incredibly disciplined,” she said. “There is a bit of a mind shift.”

Then again, the switch to real-time visual effects doesn’t have to happen overnight. Studios could ease into real-time effects by first using them for on-set visualization, while still relying on traditional rendering after the fact for the finished project. The benefit is that any images generated during a shoot could be reused for other projects, including AR or VR assets, explained Epic general manager Marc Petit. “We don’t like to throw any pixels away.”

In the end, this could be one of the biggest upsides of real-time effects: They could open up the opportunity for new interactive digital media, including videos that allow viewers to change key assets. In other words: A future in which the viewer can change the color and make of the getaway car is not all that far away.