How does the adult entertainment industry feel about artificial intelligence (AI) technology that makes it possible to easily doctor porn videos and swap the faces of adult film actresses for those of Hollywood stars and other celebrities? The answer, in one word: conflicted.
That’s not just because their X-rated clips are at the center of the Deepfake AI porn phenomenon, which has been making waves online in recent weeks. The adult entertainment industry has long both benefited from and struggled with the impact of new technologies, and it has learned the hard way how challenging it can be to put the genie back in the bottle.
“I think it’s f***ed up,” said Adam Grayson, chief financial officer of adult production company Evil Angel, when asked about the Deepfake phenomenon. “Everything we do at Evil Angel and in the core of the adult industry is built around the word consent. Deepfakes by definition run contrary to consent.”
Deepfake porn first surfaced on the internet in late 2017, when a Reddit user began posting clips he had compiled on his own home computer. The clips showed X-rated action, seemingly starring some of Hollywood’s biggest names. To make them, the user downloaded thousands of celebrity photos from the web and trained AI algorithms on those data sets, allowing the software to replace porn performers’ faces with computer-generated likenesses of the celebrities.
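For readers curious how that face replacement works under the hood, the approach widely reported to power early Deepfake tools is a pair of autoencoders that share a single encoder: each decoder learns to reconstruct one person’s face, and swapping decoders at inference time transplants one person’s face onto the other’s footage. The sketch below illustrates that general idea only; it is not the app’s actual code, and every model size, variable name, and training detail is an illustrative placeholder.

```python
# Minimal sketch of the shared-encoder / two-decoder autoencoder idea
# reported as the basis for early deepfake tools. All sizes, names, and
# training details here are illustrative assumptions, not the real app.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256),  # shared latent face representation
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 128, 8, 8)
        return self.net(x)

# One shared encoder learns features common to both faces; each decoder
# learns to reconstruct only its own person from those features.
encoder = Encoder()
decoder_a = Decoder()  # trained on face A (e.g., the performer in the clip)
decoder_b = Decoder()  # trained on face B (e.g., the celebrity photo set)
loss_fn = nn.L1Loss()
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=5e-5,
)

# Random tensors stand in for batches of aligned 64x64 face crops.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()

# The "swap": encode person A's frame, then decode it with person B's
# decoder, yielding B's face in A's pose and lighting.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```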
The results were convincing, scary, and enticing enough for thousands of users to flock to dedicated Deepfake Reddit communities to examine videos purporting to feature the likes of Emma Watson and Daisy Ridley.
Then, someone released an app that made it easy to compile these kinds of clips at home. Dozens of new clips surfaced within days, and the app was downloaded more than 100,000 times by the beginning of February, according to its developer. A number of online platforms, including Reddit and Twitter, have since banned Deepfakes, but the purveyors of these videos have found other places to swap them.
For some in the industry, all of this feels a bit like déjà vu. “This whole issue of Deepfakes — it is not new,” argued Corey Silverstein, whose law firm regularly represents adult film studios and distributors. “Fake celebrity porn has been around since the 1990s.” Back then, the issue was photoshopped stills rather than AI-doctored videos, but Silverstein argued that there is nonetheless a legal avenue against the distribution of Deepfakes.
Experts disagree on what legal remedies celebrities have. That’s in part because Deepfake creators don’t simply copy one single photo, but use thousands of pictures to create the likeness of an actress, making it harder to pursue copyright claims. “It will be a battle,” admitted Silverstein, but he argued that this battle could be won.
Key to such a win could be the adult industry itself, since the original porn scenes used in Deepfakes are often easy to identify. Celebrities could ask porn studios to have the clips taken down on their behalf, said Silverstein. “The adult entertainment industry has always had their doors open.”
From a Deepfake adult video generated with images of “Star Wars” actress Daisy Ridley.
Xavi Clos, head of production of virtual reality porn company BadoinkVR, seemed open to such cooperation. “If there is ever an instance though where someone’s privacy is being violated, or the law is being broken, we will absolutely take the necessary steps to take down any such clips.”
“If a celebrity called us asking for help, of course we’d do everything in our power,” promised Grayson as well. But he was more skeptical about the effectiveness of such a collaboration. “As hard as it is for us to solve our own piracy problems via takedowns, I’d be really skeptical if our IP rights of the underlying content would be helpful.”
“Given the rogue nature of Deepfakes, I don’t see how it can be effectively stopped,” agreed Alec Helmy, president and publisher of the adult industry publication Xbiz. Producers of adult content were already having a hard enough time getting pirated copies of their own content removed from the web, he said. “I would imagine Deepfakes being an even bigger problem.”
That being said, some believe that the adult industry could ultimately make use of Deepfake technology as well. “Deepfakes address a specific need in the market,” said Clos. “People could potentially use this for non-celebrities, perhaps a partner. The key thing is getting their permission.”
“There’s an interesting world of artificial CGI porn that will be happening the next decade, where a fan can easily put any face on any body in a porn scene,” agreed Grayson. “But I think of that for personal consumption rather than public humiliation.”
Chances are, this will happen sooner rather than later. A representative for the adult film studio Naughty America didn’t answer Variety’s questions about Deepfakes, but instead emailed a link to a porn video produced by the studio, complete with a Deepfake version to boot.