Facebook is continuing its campaign to try to persuade Americans that it's doing everything it can to ensure the social platform doesn't gum up the U.S. elections this November.

Facebook CEO Mark Zuckerberg on Thursday announced additional steps the company is taking to help secure the “integrity” of U.S. elections by encouraging voting, steering users to authoritative information, and reducing the risks of post-election confusion.

Among the new measures: Facebook will not accept new political ads in the week before the Nov. 3 election (although it will allow ads that have already been paid for before then).

One thing Facebook isn’t changing: It will retain its policy of not fact-checking political ads (or posts by politicians), except where they violate its broader prohibitions against disinformation or other community guidelines, a company rep confirmed. (Third-party advertisers like political action committees are “eligible to be fact-checked,” the Facebook spokesman added.)

Other steps include removing posts that claim people will get COVID-19 if they take part in voting and attaching an informational label to content that seeks to “delegitimize” the outcome of the election or discuss the legitimacy of voting methods (for example, by claiming that lawful methods of voting will lead to fraud).

And, according to Zuckerberg, Facebook will take action against anyone who tries to claim victory before the ballots are officially counted or claims that “lawful methods of voting,” including mail-in ballots, will lead to fraud. President Trump has repeatedly asserted that mail-in voting is susceptible to widespread fraud, despite evidence to the contrary. On Wednesday, Trump suggested that voters in North Carolina vote twice, both by mail and in person; in the United States, it’s illegal to vote more than once.

The Facebook policy “will definitely apply to the president,” Zuckerberg said in an interview on “CBS This Morning.”

Zuckerberg, in a Facebook post, said the company is teaming with Reuters and the National Election Pool to provide “authoritative information” about election results. That will be displayed in Facebook’s Voting Information Center and the service will proactively notify users as results become available.

“Importantly, if any candidate or campaign tries to declare victory before the results are in, we’ll add a label to their post educating that official results are not yet in and directing people to the official results,” Zuckerberg wrote.

As for the ban on new political and issue-based ads in the week prior to the Nov. 3 election, Zuckerberg said advertisers will be able to continue running ads they started before the final week and adjust the targeting for those ads. But, he added, “those ads will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them.”

Meanwhile, Twitter last year said it would ban political advertising (while still allowing some issue-based ads). “We believe political message reach should be earned, not bought,” CEO Jack Dorsey tweeted in October 2019.

The new steps, the CEO said, build on other work Facebook is doing, including urging people to register to vote. It also will launch a national drive later in September to recruit poll workers, and Facebook plans to provide ad credits to election officials “so they can run ads to recruit poll workers where they are most needed,” Zuckerberg wrote.

“This election is not going to be business as usual,” he wrote. “We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”

Facebook’s goal is to help 4 million people register as voters — and then vote. Zuckerberg noted that he and his wife, Priscilla Chan, have donated $300 million to “non-partisan organizations supporting states and local counties in strengthening our voting infrastructure.”

Zuckerberg also cited Facebook’s recent takedown of a network associated with the Russian-backed Internet Research Agency — the same troll network that spread misinformation on Facebook ahead of the 2016 U.S. election. In addition to foreign actors attempting to meddle in American politics, he said, “we’re increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders.”