Twitter and Reddit are both looking to put an end to the distribution of AI-powered celebrity porn on their platforms. The two companies announced Wednesday that they would ban these so-called deepfake clips, which use artificial intelligence to superimpose the faces of celebrities onto the bodies of porn performers in X-rated videos.
Reddit banned several communities, or subreddits, dedicated to the exchange of deepfake videos on Wednesday. The company also updated its rules on involuntary pornography to include a ban on “depictions that have been faked.” In addition, Reddit now bans the posting of images “for the specific purpose of faking explicit content or soliciting ‘lookalike’ pornography,” according to its updated rules.
The rule change comes after Reddit remained silent for weeks amid the growing popularity of deepfakes on its platform. The phenomenon first surfaced on Reddit in December, when a user began posting videos featuring the likenesses of celebrities that he had generated with the help of artificial intelligence algorithms.
Then in January, deepfakes got an unexpected boost after another user released software that made it easy to compile such clips without much technical knowledge. Since then, dozens of explicit clips featuring the faces of celebrities have been posted on Reddit.
Earlier on Wednesday, Twitter told Motherboard that it was banning deepfakes on its service as well. The explicit videos had already been banned by GIF hosting site Gfycat and adult video hosting service Pornhub.