Zuckerberg, in a post Friday on Facebook, said that while Trump’s message included a “troubling historical reference” — specifically, the phrase “when the looting starts, the shooting starts” — and while he personally disagreed with the president’s inflammatory rhetoric, Facebook decided not to remove the post in order to “enable as much expression as possible.”
Over the weekend, Facebook staffers took to Twitter to voice disagreement with Zuckerberg, with some lauding Twitter’s decision to place a warning label over Trump’s tweet for violating its rules against glorifying violence. The controversy comes as cities across the U.S. are engulfed in ongoing civil unrest sparked by the death of George Floyd, a black man killed by Minneapolis police officers last week.
On Monday, some Facebook workers staged a virtual walkout in protest over the issue.
“There isn’t a neutral position on racism,” Facebook design manager Jason Stirman said on Twitter. “I’m a FB employee that completely disagrees with Mark’s decision to do nothing about Trump’s recent posts, which clearly incite violence. I’m not alone inside of FB.”
“Mark is wrong, and I will endeavor in the loudest possible way to change his mind,” Ryan Freitas, director of product design for Facebook’s News Feed, wrote in a tweet. He said he “focused on organizing 50+ like-minded folks into something that looks like internal change.”
The Trump comment in question — cross-posted May 29 to Twitter, Facebook and Instagram — used the exact phrase, “when the looting starts, the shooting starts,” that Miami’s racist police chief invoked in 1967 when he spoke about violently suppressing civil unrest in black neighborhoods. It was also used by rabid segregationist George Wallace during his 1968 presidential campaign. Trump later claimed he was unaware of the phrase’s racist history.
Trump’s “looting-shooting” comment “encourages extra-judicial violence and stokes racism. Respect to @Twitter’s integrity team for making the enforcement call,” David Gillis, a Facebook director of product design, tweeted Sunday. He added that “when we have to vigorously debate whether to make an exception to the way we interpret and enforce a given policy (as happened on Friday), this often indicates that said policy needs to evolve. I think that is the case here.”
And Jason Toff, a Facebook product management director, wrote in a tweet Sunday night: “I work at Facebook and I am not proud of how we’re showing up. The majority of coworkers I’ve spoken to feel the same way. We are making our voice heard.”
“Disappointed that, again, I need to call this out: Trump’s glorification of violence on Facebook is disgusting and it should absolutely be flagged or removed from our platforms,” Brandon Dail, a user-interface engineer at Facebook, said Friday. “I categorically disagree with any policy that does otherwise.”
In a statement on the employee protests, a Facebook spokesman said in an email, “We recognize the pain many of our people are feeling right now, especially our Black community. We encourage employees to speak openly when they disagree with leadership. As we face additional difficult decisions around content ahead, we’ll continue seeking their honest feedback.”
Meanwhile, Trump and Zuckerberg spoke by phone Friday, a call in which the Facebook chief “expressed concerns about the tone and the rhetoric” of the president’s “looting and shooting” remark, Axios reported. Citing anonymous sources, Axios said that while Zuckerberg “didn’t make any specific requests,” he told Trump he was “putting Facebook in a difficult position.”
Amid the violence and protests that have erupted over Floyd’s killing, Zuckerberg said in a post at 10:05 p.m. PT Sunday that Facebook is committing $10 million “to groups working on racial justice.” Facebook joined a chorus of other companies supporting the Black Lives Matter cause.
Instagram, in a post about the donation, said, “Time and time again, we have seen that the Instagram community has the power to bring about meaningful change. The more we #ShareBlackStories, the more we raise voices that make a lasting impact. To continue that impact, @facebook is pledging $10 million to efforts committed to ending racial injustice. #BlackLivesMatter.”
In his post last Friday, Zuckerberg said Facebook “very closely” evaluated Trump’s post about whether it violated policies. He said the president’s reference to the deployment of the National Guard in Minneapolis weighed in the decision to leave up the post because “we think people need to know if the government is planning to deploy force.”
Zuckerberg has previously said the company will not fact-check political speech, including political ads. That stands in contrast to Twitter, which has said it will continue to fact-check such content and has stopped accepting political advertising altogether.
“We have a different policy I think than Twitter on this,” Zuckerberg said in a Fox News Channel interview that aired May 28. “You know, I just believe strongly that Facebook shouldn’t be the arbiter of truth of everything that people say online… We’ve been pretty clear on our policy that we think that it wouldn’t be right for us to do fact-checks for politicians.”
Facebook’s policy on violent speech, according to its Community Standards, says, “While we understand that people commonly express disdain or disagreement by threatening or calling for violence in non-serious ways, we remove language that incites or facilitates serious violence. We remove content, disable accounts, and work with law enforcement when we believe there is a genuine risk of physical harm or direct threats to public safety.”