YouTube Tweaks Algorithms to Stop Recommending Conspiracy Videos

[Photo: YouTube office — Jenny Tobien/picture-alliance/dpa/AP Images]

YouTube wants to make sure it doesn’t recommend 9/11 conspiracy theories to unsuspecting viewers anymore: The Google-owned video service is tweaking its recommendation algorithms to stop recommending “borderline content and content that could misinform users in harmful ways,” it said in a blog post Friday.

Some of the videos affected by the algorithm changes include “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” Those videos don’t necessarily violate YouTube’s terms of use, which means the site won’t remove them. Users will still be able to find any content identified as borderline via search, and on the channels publishing that kind of content.

The move comes after repeated reports of YouTube adding conspiracy videos and other harmful content to autoplay queues and homepage recommendations. On Thursday, BuzzFeed published an extensive investigation showing that mainstream news content was often followed by recommendations for anti-immigrant propaganda and conspiracy videos.

YouTube likened these new tweaks to previous changes to its algorithms, which at one point heavily favored so-called reaction videos — clickbait content designed to respond to popular videos without actually adding anything to the discussion. “We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often,” YouTube said in its blog post Friday.

YouTube’s new measures against conspiracy content will affect “less than one percent” of the content hosted on the service. The changes are initially rolling out only in the U.S., but YouTube plans to eventually bring them to other countries as well.