Reddit has banned a community dedicated to Fakeapp, an artificial intelligence (AI) video editing application that has gained some notoriety for being used for the production of so-called Deepfake celebrity porn clips. This follows an earlier ban of communities dedicated to the sharing of Deepfake videos.
Reddit users who visited the community Thursday found it banned, complete with a note reading: “This subreddit was banned due to a violation of our content policy, specifically our policy against involuntary pornography.” A Reddit spokesperson declined to comment further.
Deepfakes started to make headlines in December, when a Reddit user figured out how to use AI to replace the faces of porn actresses with those of celebrities in video clips. In January, another Reddit user published an app that lets anyone without much technical knowledge produce their own Deepfake videos. The app had been downloaded more than 100,000 times by early February.
Reddit initially remained mum on the phenomenon, but rolled out new guidelines explicitly banning involuntary pornography last week. That’s when the company also took the step of banning a number of communities associated with Deepfakes.
The Fakeapp community was primarily used for discussing the software itself, which has also been used to create non-pornographic clips, and moderators expressly banned the posting of adult content. However, the community’s moderator said Thursday that he had been informed by Reddit administrators that they had found instances of involuntary pornography, which led to the shutdown.
“I sincerely thought I was effectively filtering or manually removing all of that sort of content as it was being posted, but it is possible some slipped in and I didn’t catch it,” he wrote in a statement on Reddit. “I am disappointed but I will respect the admins’ decision.”