Facebook has stepped up its quest to improve its public image after recent PR nightmares, including the mid-2020 advertiser boycott over the platform’s handling of hate speech and more recently published evidence suggesting Facebook contributed to the Capitol insurrection.
One of Facebook’s latest ideas is forming an election commission, which would help the platform make decisions in areas including election misinformation and political ads, The New York Times reported in late August.
Facebook is reportedly expected to announce the new advisory body this fall, though the effort could still fall apart because talks surrounding the initiative remain preliminary.
The potential election commission sounds fairly similar to the Oversight Board, which is tasked with reviewing whether certain content takedowns by Facebook are warranted.
But the NYT suggests a key difference between the two: the election commission would proactively provide opinions to Facebook on election content-related matters, whereas the Oversight Board was built as a body that responds to content decisions the social platform has already made.
Even if a newly formed election commission is tasked with proactively providing Facebook guidance on tough content-moderation decisions starting in the fall, it seems fair to assume goodwill toward Facebook probably won’t change too dramatically as a result.
For one, there’s never a shortage of advice for Facebook from well-informed individuals whenever a big scandal hits the platform (as when professors from UNC Chapel Hill, Harvard and Stanford chimed in on the Trump suspension decision). That will likely lead cynics to view an election commission as a deflection tactic rather than an earnest, impartial attempt by Facebook to clean itself up.
Meanwhile, consider the massive content-moderation PR headaches Facebook has still faced even with the Oversight Board operational.
Even after Facebook deferred the Trump suspension case to the Oversight Board in late January, a February letter to the board signed by GOP reps read, “Instances where conservative viewpoints have been censored, blocked, or diminished harm the free exchange of ideas and irreparably damage conservative Americans’ faith in the fundamental fairness of purportedly neutral actors like Facebook.”
And when Facebook decided to suspend Trump for two years in response to the Oversight Board rejecting the social platform’s indefinite suspension time frame for the former president, many consumers on both sides of the aisle were still unhappy, albeit for different reasons: those on the left felt two years wasn’t enough, while those on the right continued to decry censorship.
Also recall that The New York Times reported earlier in August that there is now a rift between Facebook and the White House over the platform’s spread of vaccine misinformation.
What would likely be significantly more effective than an election commission in boosting Facebook’s public image is for the company to become more consistently transparent about how it tweaks its News Feed algorithm.
For example, Facebook’s late-2020 reversal of a News Feed tweak that had lifted more authoritative news sources likely left a bad taste in some consumers’ mouths.
These consumers might be open to using Facebook more often if they received more regular updates on how Facebook was boosting credible news in News Feed.
If not, Facebook will continue to draw criticism from both the left and the right, with accusations of unfair censorship remaining especially loud.
After all, the share of Republicans, and independents who lean Republican, who believe it’s somewhat or very likely that social media sites censor certain political viewpoints already increased, albeit slightly, from 2018 to 2020, according to a June 2020 survey by the Pew Research Center.