Image caption: More people may vote by post, delaying the election result
Days after banning ads prematurely declaring victory in the presidential election, Facebook has now banned those questioning the US electoral process.
Ads seeking to “delegitimise any lawful method or process of voting” will be banned, amid concerns some may claim postal voting could encourage fraud.
It has also removed Trump-sponsored ads claiming accepting refugees would increase the risk of Covid-19.
They showed his opponent Joe Biden talking about the US border and asylum seekers.
More than 38 versions of the ads were seen by hundreds of thousands of people before they were removed.
“We don’t allow claims that people’s physical safety, health or survival is threatened by people on the basis of their national origin or immigration status,” Facebook said.
The pandemic is expected to prompt many more people than usual to vote by post in the election.
Facebook’s Rob Leathern said: “As we get closer to election day, we want to provide further clarity on policies we recently announced.
“Last week, we said we’d prohibit ads that make premature declarations of victory.
“We also won’t allow ads with content that seeks to delegitimise the outcome of an election.”
"For example, this would include calling a method of voting inherently fraudulent or corrupt, or using isolated incidents of voter fraud to delegitimize the result of an election. You can find more info and specifics in our Help Center here https://t.co/BPnm1z7LW6 (2/3)" — Rob Leathern (@robleathern), 30 September 2020
Facebook has also banned ads that praise, support or represent militarised social movements and those about QAnon, a wide-ranging unfounded conspiracy theory that suggests President Donald Trump is waging a secret war against Satan-worshipping paedophiles in government, business and the media.
There is increasing evidence QAnon followers are using the issue of child safety, and hashtags such as #savethechildren, to recruit and organise.
And Facebook said it would intervene by directing people to “credible child safety resources” when they searched for certain hashtags.
Facebook’s head of counter-terrorism and dangerous organisations, Brian Fishman, said where President Trump’s “stand down, stand by” comment had been used in support of the Proud Boys, “we’ve removed it”.
But Facebook’s vice-president of global affairs, Nick Clegg, has rejected calls to ban all political ads in the run-up to the vote.
“We block far more political ads than people appreciate,” he said.
“In the second quarter of this year, we blocked around 750,000 political ads from running on our platform which didn’t meet our requirements.
“It’s a long and familiar feature of American democracy that ads are run with great intensity,” he added, describing it as “the lifeblood of democracy”.
Facebook’s restrictions on political ads include a freeze on new ones from 27 October until election day, 3 November, and a ban on those:
- portraying voting or census participation as useless or meaningless
- delegitimising any lawful method of voting, including absentee voting, voting by mail and the lawful collection of ballots
- delegitimising an election or result as fraudulent or corrupt if the results cannot be determined on the final day of voting
- claiming voter fraud is widespread
- claiming the election date or the mechanism for electing the president can be changed in ways not permitted by the Constitution of the United States
- claiming victory prematurely
Has the US election disinformation ship sailed?
Disinformation looking to undermine the democratic process ahead of the US election is an increasing concern – especially with false claims about postal voting promoted frequently by President Trump online.
What happens on social media the day after the US takes to the polls is also a growing worry.
The changes to Facebook’s ad policy will be welcomed.
But whether social-media sites are doing enough to tackle the rise of misleading claims and unfounded conspiracy theories in the weeks before polling day is an entirely different question.
Adverts containing such false claims – which reached millions – remained on Facebook because it does not fact-check political speech, including in ads.
And it’s not just about the official campaigns.
Supporters of the QAnon conspiracy theory have been plugging political disinformation for weeks now, with fears this could affect US voters and be exploited by foreign influence campaigns.
I asked Facebook boss Nick Clegg about this on Tuesday.
And now the site is finally taking further action to tackle this evolution of QAnon.
While this is welcome, many will question whether this reactive approach is effective – and whether it has again come too late.
Political disinformation plugged by QAnon has already reached local Facebook groups, Instagram feeds and WhatsApp chats.