Facebook’s Lose-Lose Situation in Election Misinformation Battle

There’s little room for error in Facebook’s push to root out misinformation campaigns that may sprout up before the final results of the election are known. 

Keep in mind that Facebook has already conceded it was too slow to recognize the 2016 election-meddling effort by Russian entities. Separately, lawmakers on both sides of the aisle want to regulate the platform more heavily, and it just weathered a boycott by major brands this summer. 

So, Facebook — thin ice? Yes.  

That puts added pressure on the platform to root out misinformation on an Election Day when voters seem particularly vulnerable to confusion. 

While many consumers are accustomed to learning results the night of Election Day, the historic volume of mail and early in-person voting due to the virus has created uncertainty about when the winner of this race will be known.  

Battleground states like Michigan and Pennsylvania are aiming to finish counting ballots within three days of Election Day. Others, like South Carolina and Wisconsin, aim to have results by today or tomorrow.  

That makes it particularly important for Facebook to stamp out any efforts by bad actors with big reach to prematurely declare a winner, which could fuel the spread of more misinformation or, even worse, incite violence. 

Perhaps that’s why Mark Zuckerberg said on his company’s Q3 earnings call last week that this week will “certainly be a test for Facebook.” 

It’s not like Facebook is totally unprepared. Aiming to prevent the doomsday scenario just mentioned, the company has guardrails in place to make it clear to voters that results aren’t final yet. 

The company has tripled the size of its safety and security team to 35,000 since 2016, and it now works with third-party fact-checkers in a program that, laughably, started only after Facebook was criticized for being used by Russian trolls in 2016. (To be fair, Facebook employed tech to combat misinformation before then, but the third-party fact-checking program began in 2016.) 

Additionally, the company has announced it will temporarily stop running political ads in the U.S. after the polls close today, and it has started labeling state-controlled media on the platform. You can find those and a laundry list of other election-integrity measures Facebook has taken here. 

Of course, Facebook deserves credit for being more proactive this time around, but that list reads as much like a last-ditch effort as a record of sincere attempts to right the wrongs of the previous presidential race. 

That explains why Marietje Schaake, international policy director at Stanford University’s Cyber Policy Center, has characterized Facebook’s election preparations as ad hoc and haphazard at best. 

It becomes easier for skeptics to question Facebook’s sincerity given its history of reacting to PR fires by putting new measures in place (like the fact-checking program; also see the reaction to Trump’s post earlier this year). 

Motives aside, even if Facebook can catch misinfo campaigns this week (like the dubious Hunter Biden story that surfaced recently), it will still eventually face criticism over its content-moderation decisions when the winner of the election is officially announced. 

Should Trump win, Democrats could point to a 2017 algorithm change Facebook made to minimize the presence of political news in News Feed. In mid-October, the WSJ reported that the tweak was dialed up to affect left-leaning sites like Mother Jones after execs became concerned about the disproportionate effect it would have on right-leaning sites (Facebook denies making changes with specific publishers in mind). 

On the other hand, if Biden wins, some right-wing politicians will simply see further evidence of conservative voices being suppressed on Big Tech platforms. Facebook fielded exactly those concerns when Zuckerberg testified at the recent Section 230 hearing. 

And to top it off, public confidence that the platform will do right by its users is low. A majority of U.S. adults doubt that tech companies like Facebook can prevent election meddling, according to an August Pew Research Center survey. 

Most adults also agree that social media has a “mostly negative” effect on the country, according to a separate July Pew survey. 

This doesn’t mean Facebook has no incentive to combat election misinfo at all — of course it does. 

It just means those attempts may do more to shield the company from regulatory scrutiny (by letting it say it covered its bases) than to boost its public image.