
‘13 Reasons Why’ Posts Get Extra Attention From Facebook Moderators to Prevent Suicides, Leak Shows

A massive leak of content moderation guidelines shows that Facebook is often walking a fine line between free speech and the welfare of its users. Case in point: Facebook’s moderators have been advised to give any reported posts that make mention of the controversial Netflix show “13 Reasons Why” some extra attention to prevent suicides, according to an article in The Guardian.

The British newspaper got its hands on a treasure trove of internal documents meant to instruct content moderators on how to deal with anything from physical threats of violence to revenge porn and the depiction of animal abuse. Some of the documents specifically deal with self-harm and suicide, and the company apparently saw it necessary to update its guidelines after Netflix premiered “13 Reasons Why” in March.

The show tells the story of a teenager who takes her own life — and Facebook was apparently so concerned that it might inspire copycats that it advised all of its moderators to immediately escalate any content related to the show to senior managers, according to The Guardian.

It’s worth noting that this doesn’t mean Facebook puts everyone who has ever posted about the show on suicide watch. Moderators only come into play once other Facebook users have flagged a post for review. According to the leaked documents, moderators reviewed around 10,400 posts about self-harm during one four-week period this year.

A very small percentage of these posts leads Facebook to contact law enforcement. Last year, the site saw 4,531 such reports during a two-week period and alerted law enforcement in 63 of those instances.

Facebook has frequently been criticized for not doing enough to moderate the content posted to its site, criticism that boiled over when video of a homicide was posted to the site by the perpetrator last month. In response, Facebook CEO Mark Zuckerberg admitted that the service had to do more and announced the hiring of 3,000 additional moderators, bringing the total number of staffers reviewing flagged content to 7,500.

“Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech,” a spokesperson told Variety via email. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”

The leaked documents show that the service frequently sides with free speech. For example, in some cases Facebook opts to keep threats of violence on the site if they are not deemed credible or are merely aspirational. Moderators are also advised not to delete media depicting animal cruelty or non-sexual abuse of children, unless the post in question condones those acts.

Facebook is also walking a fine line on self-harm, advising moderators not to take down live streams of users who talk about or threaten to hurt themselves. The reasoning, as expressed in one of the guidelines: “We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.”
