YouTube, launching a new way to fight the spread of terrorist propaganda, said it has begun rolling out a feature that will redirect searches for specific terrorism-related keywords to a playlist of videos “debunking violent extremist recruiting narratives.”
YouTube is starting the new anti-terrorism feature with a small number of query terms; a rep declined to reveal what those search terms are, or what the anti-terrorism videos are.
For YouTube, the move is part of protecting its bottom line. The service was boycotted by hundreds of advertisers earlier this year, after it came to light that their ads were appearing in videos espousing violent extremism and hate rhetoric. Google has cracked down on ads being served in offensive videos, and outlined several other steps it’s taking to counter terrorist and extremist content. Last month, YouTube joined with Facebook, Microsoft and Twitter to form a new coalition to make their services “hostile to terrorists and violent extremists.”
The new YouTube feature uses the Redirect Method developed by Jigsaw, a think tank and incubator within Alphabet that builds on Google products like AdWords and YouTube. Jigsaw collaborated with Moonshot CVE, an organization that uses technology to combat violent extremism, on research into how extremist groups use technology to recruit new members. Based on that research, Jigsaw created the Redirect Method, which steers people away from violent extremist propaganda and toward video content that “confronts extremist messages and debunks its mythology.”
YouTube had previously said it was using Jigsaw’s Redirect Method in Europe, serving targeted ads to potential ISIS recruits in order to redirect them to anti-terrorist videos.
Over the next few weeks, YouTube said it plans to expand the redirect functionality to a wider set of search queries in non-English languages, using machine learning to dynamically update search query terms. In addition, YouTube said it will work with expert nongovernmental organizations to develop new video content designed to counter violent extremist messaging “at different parts of the radicalization funnel.”
“We hope our work together will also help open and broaden a dialogue about other work that can be done to counter radicalization of potential recruits,” YouTube said in a blog post Thursday outlining the new feature.
In an update to its guidelines for advertiser-friendly content last month, YouTube clarified that videos deemed “hateful” — material already banned under YouTube’s community guidelines — are not eligible for advertising.
Other steps Google says it’s taking to fight terrorist propaganda include: using video-analysis models to identify and remove terrorism-related content; and adding 50 expert NGOs to YouTube’s Trusted Flagger program, on top of the 63 organizations that are already participating.
In addition, YouTube will demonetize, disable comments for, make ineligible for recommendations, and add interstitial warnings in front of “videos that do not clearly violate our policies” but contain offensive subject matter such as “inflammatory religious or supremacist content,” Google general counsel Kent Walker said in a June blog post. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”