YouTube wants to make sure it doesn’t recommend 9/11 conspiracy theories to unsuspecting viewers anymore: The Google-owned video service is tweaking its recommendation algorithms to stop recommending “borderline content and content that could misinform users in harmful ways,” it said in a blog post Friday.
This move comes after repeated reports of YouTube surfacing conspiracy videos and other harmful content in autoplay queues and homepage recommendations. On Thursday, BuzzFeed published an extensive investigation showing that mainstream news content was often followed by recommendations for anti-immigrant propaganda and conspiracy videos.
YouTube likened these new tweaks to previous changes to its algorithms, which at one point heavily favored so-called reaction videos — clickbait content designed to respond to popular videos without actually adding anything to the discussion. “We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often,” YouTube said in its blog post Friday.
YouTube’s new changes against conspiracy content will affect “less than one percent” of the content hosted on the service. The changes are initially rolling out in the U.S. only, but YouTube wants to eventually bring them to other countries as well.