Facebook removes Iran, Russia networks trying to disrupt US 2020 and other votes

Social media giant announces move as it seeks to step up security, improve its image; Iran-based accounts posted anti-Israel propaganda, company says

Illustrative: A man passes a Facebook screen at the Gamescom gathering in Cologne, Germany, August 20, 2019. (AP Photo/Martin Meissner)

Facebook said Monday that it had taken down four separate networks of fake, state-backed misinformation-spreading accounts based in Iran and Russia.

The networks included accounts, pages and groups on both Facebook and Instagram that were carrying out “coordinated inauthentic behavior” in violation of the company’s policy, Nathaniel Gleicher, head of the company’s cybersecurity policy, said in a statement.

The networks sought to disrupt elections in the US, North Africa and Latin America, the company said. Facebook said it had taken down 50 such clusters of accounts in the past year, a sign that efforts to use its services to disrupt elections are not letting up. The company said it had shared its findings with law enforcement, policymakers and others in the industry.

Three of the networks removed in the latest effort originated in Iran and one in Russia, the company said.

The company removed 93 Facebook accounts, 17 pages and four Instagram accounts that originated in Iran and targeted users in the US. About 7,700 Facebook users followed the pages, while the Instagram accounts had a paltry 145 followers.

Facebook CEO Mark Zuckerberg speaks at Georgetown University, October 17, 2019, in Washington. (AP Photo/Nick Wass)

The people behind the accounts worked to mislead others, mostly in the US, by posing as locals of the areas they targeted, and posted mostly about local and global politics, including Israel and the Palestinians, the company said.

The company provided a sample of the content posted by some of the pages it had taken down, including false claims that the Israel Defense Forces had admitted to targeting civilians in Gaza.

An additional 38 Facebook accounts, six pages, four groups and 10 Instagram accounts originating in Iran targeted countries in Latin America. The accounts posted repurposed Iranian state media reports on topics including Israel, Hezbollah and Saudi Arabia.

(Pages on Facebook are public accounts that can be managed by multiple people and followed, viewed or liked by anyone and are used, for example, by public figures, businesses or brands. Groups are more personal platforms meant for small group communication, often centered around a common cause, issue or activity.)

A further 50 Instagram accounts with 246,000 followers, 60 percent of which were in the US, originated in Russia and aimed to influence US elections. The company said that the campaign had links to Russia’s Internet Research Agency and appeared to be a well-funded and sophisticated operation.

Illustrative: the Instagram app icon on the screen of a mobile device in New York, August 23, 2019. (AP Photo/Jenny Kane)

The Russia-based effort pushed both liberal and conservative agendas on topics including US elections, the environment, LGBTQ issues and racial tensions.

Facebook’s countermeasures are part of its efforts to step up security ahead of the 2020 US elections to ensure it is not used as a tool to interfere in politics and democracies around the world.

Efforts outlined Monday include a special security tool for elected officials and candidates that monitors their accounts for signs of hacking, such as login attempts from unusual locations or unverified devices. Facebook said Monday it will also label state-controlled media as such, label fact-checks more clearly and invest $2 million in media literacy projects.

“Elections have changed significantly since 2016 and Facebook has too,” CEO Mark Zuckerberg said in a conference call Monday. The social network was caught embarrassingly off guard during the 2016 election, having let others use its platform to spread misinformation, manipulate voters and meddle with democracy.

Facebook is under fire from presidential candidates, lawmakers, regulators and privacy advocates around the world over problems ranging from election security to alleged anti-competitive behavior, privacy violations and what many see as its outsize, often negative influence on society. It’s under several antitrust investigations in the US.

The scrutiny from all sides has been ramping up since the 2016 elections. Getting it right in 2020 — or at least preventing a disaster — is crucial for the company. But even if nothing goes terribly wrong, Facebook’s efforts are unlikely to mollify politicians and regulators concerned about its clout.

Global citizens movement Avaaz display life-sized Zuckerberg cutouts near the EU Commission to protest against fake Facebook accounts spreading disinformation on the platform, in Brussels, May 22, 2018. (AP/Geert Vanden Wijngaert)

As part of its efforts to clamp down on misinformation, Facebook said it will add more prominent labels to debunked posts on both Facebook and Instagram, placing labels over photos and videos deemed “false” or “partly false.”

But Facebook will continue to allow politicians to run ads containing misinformation, and it hasn’t said much about how it handles misinformation spread on its private messaging services such as WhatsApp and Messenger, at least beyond simple measures such as limiting how many times messages can be forwarded.

Critics say Facebook’s measures don’t go far enough and argue that the main problem is its business model, which depends on targeted advertisements and making sure users stay engaged and entertained. Senator Elizabeth Warren, a leading Democratic presidential candidate and one of Facebook’s biggest critics, has proposed breaking it up.

Facebook also said it will add more information about the people or groups who establish or manage Facebook pages. The company said Monday it has noticed groups and people “failing” to disclose the organizations behind pages, which can mislead users into thinking those pages are independent. Starting with large pages in the US, Facebook said it is adding a new section about “organizations that manage this page.”

Facebook said it will require the pages’ creators to add this information in order to run ads. The rule applies to pages that have gone through the company’s business verification process and to pages that run ads about social issues, elections or politics. If the page creators don’t post this information, they won’t be allowed to advertise.
