
Hardly a week goes by without conspiracy theories finding their way into YouTube’s top search results. The Google-owned video service has been trying to fight back by banning accounts like those of Infowars.com and promoting authoritative sources, but a new study suggests that many far-right YouTubers are largely unaffected by those measures. That’s in part because these video producers are using classic influencer tactics to promote their views across a tight-knit network.

To figure out how far-right YouTubers reach audiences on the platform, Rebecca Lewis from the New York-based nonprofit Data & Society recently analyzed a network of 61 influencers across 81 channels. Some of the YouTubers included in Lewis’ analysis are widely known as fringe right-wing activists, including Milo Yiannopoulos, Richard Spencer, Mike Cernovich and Gavin McInnes.

[Image: A subset of the YouTube personalities analyzed for Data & Society’s new report. Courtesy of Data & Society]

However, Lewis quickly noticed that these right-wing YouTubers frequently appeared on other shows, including some hosted by more mainstream conservative and libertarian YouTubers. That’s why she also included YouTubers like conservative commentator Candace Owens, comedian Joe Rogan and libertarian talk show host Dave Rubin in her analysis.

Plotting out the guest appearances of far-right YouTubers, she discovered a tight-knit “alternative influence network.” “Many of these YouTubers are less defined by any single ideology than they are by a ‘reactionary’ position: a general opposition to feminism, social justice, or left-wing politics,” she wrote in her report.
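To make the idea of a guest-appearance network concrete, here is a minimal illustrative sketch, not the report’s actual methodology, of how such a graph could be represented and inspected with Python’s networkx library. The channel names and appearance pairs are hypothetical placeholders, not data from Data & Society’s study.

```python
# Illustrative sketch: representing a guest-appearance ("collaboration")
# network as a graph. Channel names below are hypothetical placeholders.
import networkx as nx

# Each pair links a host channel to a guest who appeared on its show.
appearances = [
    ("HostChannelA", "GuestChannelX"),
    ("HostChannelA", "GuestChannelY"),
    ("HostChannelB", "GuestChannelX"),
    ("HostChannelB", "GuestChannelZ"),
]

graph = nx.Graph()
graph.add_edges_from(appearances)

# Degree centrality flags channels that connect many others -- one simple
# way a tight-knit "influence network" becomes visible in the data.
centrality = nx.degree_centrality(graph)
for channel, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: {score:.2f}")
```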

In that network, self-proclaimed libertarian and conservative YouTubers play a key role in promoting racist and white nationalist views. That’s because in many instances, they host members of the far right on their shows without effectively challenging their points of view. “There is a really slippery nature to this,” Lewis said during a media call Tuesday.

Across the network of YouTube channels analyzed by Lewis, she found a few recurring themes: Far-right YouTubers regularly cast themselves as an alternative to mainstream media, and they portray conservative and far-right points of view as politically persecuted.

But while that may not sound new to anyone who has ever watched conservative talking heads on network TV, there is something novel in the way far-right YouTubers propagate these views on the service: They’re relying on the very tactics used by beauty bloggers and other influencers on YouTube. From the report:

“One of the most effective ways to network on YouTube is by referencing and including other people in video content. In fact, how-to manuals for building influence on YouTube often list collaborations as one of the most effective strategies.”

The challenge for YouTube is that it is much harder to police these types of influencer techniques. Far-right YouTubers know to avoid words that would get them banned, and their guest appearances on popular shows help bring their ideas to millions of viewers. “YouTube has a much more complex problem to solve around trust and content,” said Data & Society’s media manipulation research lead Joan Donovan.

However, the report also suggests concrete steps that YouTube could take. Instead of just punishing white supremacists when they cross the line, the service should also act against the shows that give these fringe voices a platform.

“The platform should not only assess what channels say in their content, but also who they host and what their guests say,” Lewis wrote in her report. “In a media environment consisting of networked influencers, YouTube must respond with policies that account for influence and amplification, as well as social networks.”

YouTube responded Tuesday by pointing out that it has tightened its monetization rules since the report’s research concluded:

“YouTube is an open platform where anyone can choose to post videos to a global audience, subject to our Community Guidelines, which we enforce rigorously. Additionally, we’ve made updates over the past year to tighten our monetization policies and improve our enforcement against hate speech. Since this research concluded in April 2018, we’ve made more updates to which channels have access to monetization features and deployed advanced machine learning technology to tackle more hate speech in comment features. We continue to expand this work and appreciate input from all researchers.”

Update: This post was updated with a response from YouTube.