Oversight Board rules Facebook wrongfully removed 2 Israel-Hamas war posts
Panel urges company to rescind decision deleting posts showing hostage Noa Argamani and aftermath of strike on Shifa Hospital
A quasi-independent review board on Tuesday recommended that Facebook parent company Meta overturn two decisions it made this fall to remove posts “informing the world about human suffering on both sides” of the Israel-Hamas war.
In both cases, Meta ended up reinstating the posts — one showing Palestinian casualties and the other, an Israeli hostage — on its own, although it added warning screens to both because of their violent content. As a result, the company is not obligated to take any action in response to the board’s decisions.
The board also said it disagrees with Meta’s decision to bar the posts in question from being recommended by Facebook and Instagram, “even in cases where it had determined posts intended to raise awareness.” And it said Meta’s use of automated tools to remove “potentially harmful” content increased the likelihood of taking down “valuable posts” that not only raise awareness about the conflict but may contain evidence of human rights violations. It urged the company to preserve such content.
The Oversight Board, established three years ago by Meta, issued its decisions Tuesday in what it said was its first expedited ruling — taking 12 days rather than the usual 90.
In one case, the board said, Instagram removed a video showing what appears to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City. The post shows Palestinians, including children, injured or killed.
Meta’s automated systems removed the post, saying it violated the company’s rules against violent and graphic content. While Meta eventually reversed its decision, the board said, it placed a warning screen on the post and demoted it, meaning it was not recommended to users and fewer people saw it. The board said it disagreed with the decision to demote the video.
The other case concerns a video posted to Facebook of Israeli woman Noa Argamani begging her kidnappers not to kill her as she is taken hostage during the Hamas massacre in Israel on October 7.
Users appealed Meta’s decision to remove the posts and the cases went to the Oversight Board. The board said it saw an almost threefold increase in the daily average of appeals marked by users as related to the Middle East and North Africa region in the weeks following October 7.
Meta said it welcomes the board’s decision.
“Both expression and safety are important to us and the people who use our services. The board overturned Meta’s original decision to take this content down but approved of the subsequent decision to restore the content with a warning screen. Meta previously reinstated this content so no further action will be taken on it,” the company said. “There will be no further updates to this case, as the board did not make any recommendations as part of their decision.”
In a briefing on the cases, the board said Meta confirmed it had temporarily lowered thresholds for automated tools to detect and remove potentially violating content.
“While reducing the risk of harmful content, it also increased the likelihood of mistakenly removing valuable, non-violating content from its platforms,” the Oversight Board said, adding that as of December 11, Meta had not restored the thresholds to pre-October 7 levels.
Meta, then called Facebook, launched the Oversight Board in 2020 in response to criticism that it wasn’t moving fast enough to remove misinformation, hate speech, and influence campaigns from its platforms. The board has 22 members, a multinational group that includes legal scholars, human rights experts, and journalists.
The board’s rulings, such as in these two cases, are binding, but its broader policy findings are advisory, and Meta is not obligated to follow them.
“These decisions were very difficult to make and required long and complex discussions within the Oversight Board,” Michael McConnell, a board chair, said in a statement. “The board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred. These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events, some of which could be important evidence of potential grave violations of international human rights and humanitarian law.”
Meta is not the only social media company to face scrutiny over its handling of content related to the Israel-Hamas war. TikTok has drawn criticism over the prevalence of pro-Palestinian content on the popular video platform. And on Tuesday, the European Union announced a formal investigation into X, the platform formerly known as Twitter, using new regulatory powers granted last year, following an initial inquiry into a surge in “terrorist and violent content and hate speech” after October 7.
Social media platforms have come under criticism over their inability to combat hateful content and misinformation since war broke out on October 7, when Hamas terrorists burst through the border from the Gaza Strip and rampaged through southern Israel communities, murdering some 1,200 people, mostly civilians, and kidnapping some 240 to Gaza.