Israel’s Facebook bill may endanger democracy, company official implies

Simon Milner, Middle East policy director for the social network, says the Knesset should implement checks and balances if it passes a law aimed at preventing terror

Mark Zuckerberg, chairman and CEO of Facebook, speaks at the CEO summit during the annual Asia Pacific Economic Cooperation (APEC) forum in Lima, Peru, November 19, 2016. (AP Photo/Esteban Felix)

Israel’s so-called Facebook bill could have wide-ranging implications for its citizens regarding the democratic texture of the nation and its freedom of speech, a senior Facebook official implied in a meeting with Israeli journalists on Wednesday.

“If enacted, the law applies to everything that happens online, whether on a newspaper or a Twitter account, anything,” said Simon Milner, director of Facebook policy for the Middle East, UK and Africa. “It is a big decision. If the Knesset does approve the bill we hope there are appropriate checks and balances for the use of power.”

Israel has accused Facebook of facilitating Palestinian incitement against Israelis, especially following a wave of hundreds of attacks that began in October 2015, which security services said was fueled by online incitement. Earlier this month, the so-called Facebook bill, which would allow the state to seek court orders to force the social media giant to remove certain content based on police recommendations, passed its first reading in the Knesset.

The bill was proposed by Public Security Minister Gilad Erdan and Justice Minister Ayelet Shaked in July, after Erdan lambasted Facebook founder Mark Zuckerberg for allowing Palestinian incitement and hate speech to run rampant on his social media site. Erdan charged that Facebook hinders Israeli police efforts to catch terrorists and declared that Zuckerberg has “some of the blood” of Israeli teenager Hallel Yaffa Ariel on his hands. Ariel was stabbed to death in her bed by a Palestinian teenager who publicized his desire to die for the Palestinian cause in a number of Facebook posts.

Simon Milner, Facebook’s director of policy for the Middle East, UK and Africa (Courtesy)

The government says the bill will only be invoked in cases of suspected incitement, where there is a real possibility that the material in question endangers the public or national security.

Facebook has already provided its official response to the law, said Milner, expressing concerns that the bill would allow courts to rule on matters presented by the government on an ex parte basis, without a requirement to hear the other party. “We suggested this get looked at again,” he said.

He emphasized that Facebook has “zero tolerance” for terrorism and incitement to terror, and has clear policies on the matter. The US giant is already working closely with Israel’s cyber-crime unit to identify problematic posts and to “deal with them promptly,” he said. “A great majority of what they referred to us we have taken down.” But, he added, “there is no magic switch that stops hate speech or terrorism.”

The company would be “very surprised” if the law “were necessary to get Facebook to take down posts,” he said.

Facebook has also come under fire from journalists in Israel who say it has trampled on their freedom of expression by blocking their posts. Earlier this month, Hamas also slammed the social media giant for closing down over a hundred pages belonging or sympathetic to the terror group in control of the Gaza Strip.

Milner said that Facebook calls on users to flag offensive posts, and “well trained,” multilingual teams based in Menlo Park, Austin, Hyderabad and Dublin decide whether to take down a post. “So we have 24/7 coverage,” he said. The Dublin team deals with the Middle East. When major news events occur, the teams enter alert mode, on the lookout for problematic posts and shares. “Generally, we turn around reports within 24 hours,” he said. The main priorities are the most serious issues, such as child sexual abuse or incitement to acts of terror.

“But we do make mistakes,” he admitted. “We have 1.8 billion people, the biggest community on earth. We use people to assess reports and people make mistakes. It is all about judgment calls.”

Much depends on the context of the text, he explained. Sometimes a word used in hate speech can also appear in a satirical context, and that is when mistakes can be made. “We learn from these mistakes,” he said. “But the sheer numbers mean that there are mistakes to be made.”

Artificial intelligence technology is not yet sophisticated enough to discern context, he said. “The focus at the moment is to use enough well trained people” and learn from previous mistakes, he said.
