The developer has paused all of its pre-roll advertising, the ads that play before a video starts on the streaming platform, a spokesperson said. It has also reached out to Google to determine what actions, if any, the company will take to remove the exploitative content from its service.
Former YouTube creator Matt Watson raised concerns about the content days ago in a YouTube video and a Reddit post. In his video, he demonstrated how a search for something seemingly innocuous like “bikini haul” — videos in which women show off various swimsuits they’ve purchased — can lead to disturbing examples of child exploitation. Although the clips aren’t pornographic in nature, predators allegedly post timestamps of moments that unintentionally sexualize the children on screen and share them in the videos’ comments sections.
“YouTube’s recommended algorithm is facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual child pornography in the comments,” Watson wrote on Reddit. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”
Although Watson doesn’t call out “Fortnite” specifically, he does show examples of the products and companies whose ads ran before the videos, including Grammarly, GNC, and Google’s own Chromebook.
A YouTube spokesperson told The Verge it took immediate action by deleting accounts and channels. It also reported illegal activity to the authorities and disabled what it called “violative comments.”
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson said. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
Variety contacted Epic Games to find out whether it plans to resume advertising on YouTube, but the company couldn’t say when, or if, that will happen.