A top Facebook executive acknowledged that the social giant’s approach to what it allows on the service — including violence, self-harm, and pornography — has occasionally missed the mark, but added that it’s an enormously complicated task that the company is dedicated to improving.
Monika Bickert, Facebook’s head of global policy management, was responding to a Guardian report published Monday that disclosed the company’s internal policies and procedures for dealing with a wide range of controversial subjects. The newspaper said it obtained more than 100 training manuals and other documents, revealing the difficult and sometimes conflicting problems involved in determining whether to delete certain posts.
In an essay also published by the Guardian, Bickert wrote that Facebook strives to strike the right balance between competing goals, such as the tension between raising awareness of violence and promoting it. These are “complicated, philosophical questions,” she wrote. “Being as objective as we can is the only way we can be consistent across the world and in different contexts. But we still sometimes end up making the wrong call.”
Earlier this month, after several violent videos showing murders and suicides were posted to the site, Facebook CEO Mark Zuckerberg announced the company would hire 3,000 additional moderators worldwide over the next year, bringing its staff policing posts on Facebook to 7,500.
“We are making these investments because it’s the right thing to do and because we believe that the ability to share is worth protecting,” Bickert wrote.
The sheer volume of activity on Facebook — more than 1 billion people use the service daily — makes monitoring content hard, and the company’s reviewers face the added difficulty of understanding the context behind a questionable post, according to Bickert. Meanwhile, there are rarely clear-cut legal standards for how to deal with such content.
“It’s hard to judge the intent behind one post, or the risk implied in another,” Bickert wrote. “Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it?”
Facebook’s moderator guidelines can sometimes appear counterintuitive. For example, reviewers are in some cases allowed to leave up Facebook Live videos of people harming themselves “so that people can be alerted to help, but to take them down afterwards to prevent copycats,” Bickert wrote. She said that two weeks earlier, a girl in Georgia attempted suicide while broadcasting on Facebook Live — and her friends notified police, who were able to reach her before she killed herself. There have been at least six other cases like that in the past few months, she added.
There are numerous other examples where context plays a critical part in determining what Facebook allows on the service. According to the documents, reviewers are instructed to delete a comment such as “Someone shoot Trump” because it is perceived to be a credible threat. But they’re advised to leave up something like “I’m going to kill you,” which is considered not a credible threat but rather “a violent expression of dislike and frustration.”
“We get things wrong,” Bickert wrote, “and we’re constantly working to make sure that happens less often.”