Interview

Regulation takes time but it’s coming for Facebook, says Israeli law expert

Dr. Tehilla Shwartz Altshuler of the Israel Democracy Institute says that big shifts in data protection and privacy in recent years signal change for social media platforms

Ricky Ben-David

Ricky Ben-David is The Times of Israel’s Startups and Business editor and reporter.

In this October 23, 2019 file photo, Facebook CEO Mark Zuckerberg testifies before a House Financial Services Committee hearing on Capitol Hill, in Washington, DC. (AP/Andrew Harnik)

Facebook has faced its fair share of scandals in recent years. From the 2014 reports that the social media giant conducted a secret experiment, manipulating the news feeds of thousands of users without their consent to gauge their emotional reactions, to the allegations that it enabled foreign interference in United States national elections in 2016, revealed in the Cambridge Analytica privacy crisis, to the various data breaches and practices that promote misinformation over factual news, to the accusations that incitement on the platform played a key role in the genocide against the Muslim Rohingya minority in Myanmar, the list is extensive and includes many other dizzying crises that impact human lives.

Last week’s high-profile whistleblower testimony, coupled with a six-hour global outage at Facebook, Instagram, and WhatsApp, has shed new light on the company’s alleged wrongdoings, raising the ire of governments and regulators across the world.

The testimony came after weeks of silence from Facebook in the face of a Wall Street Journal series based on internal documents leaked by former Facebook product manager Frances Haugen. She told Congress last Tuesday that Facebook prioritizes its own interests like making more money over the public good, fuels division, harms children, and must be regulated.

And, this time, the company may actually face some consequences — not because of the newest leaks, but because the outage has shown how vulnerable Facebook’s systems are and affected its deep pockets, said Dr. Tehilla Shwartz Altshuler, a senior fellow at the Israel Democracy Institute and head of its Democracy in the Information Age program.

“There have been whistleblowers before and different leaks,” Shwartz Altshuler told The Times of Israel, noting the many scandals, including that “Facebook has been selling the emotional crises of insecure teenagers to advertisers for years.”

“None of these episodes actually caused such a problem for Facebook. Without the outage, I don’t think the leaks will do what the Cambridge Analytica scandal did not do,” she said, referring to the 2018 revelations that Facebook allowed data-mining United Kingdom firm Cambridge Analytica to forage through the personal data of millions of unknowing users.

But since then, there’s been a big shift in data protection and privacy regulations, especially in Europe, said Shwartz Altshuler, an expert on law and technology.

A demonstrator joins others outside Facebook CEO Mark Zuckerberg’s home in San Francisco to protest what they say is Facebook spreading disinformation, on November 21, 2020. (AP Photo/Jeff Chiu, File)

“Things have changed dramatically in the EU where they passed the General Data Protection Regulation (GDPR) in 2018. They are now working very, very hard to complete the legislation of the Digital Services Act, which is going to have a huge effect on social media platforms. They have introduced the AI legislation [The Artificial Intelligence Act], which is going to take another year or so before it’s completed,” she said.

“We’ve seen the same movement in America. Look at the people [US President Joe] Biden has nominated to the FTC [Federal Trade Commission] and his advisers… a lot of them represent the bigger fight against social media platforms,” Shwartz Altshuler explained.

“Regulations take time. If we want to compare it to the tobacco industry, for example, it took three generations to cope with it. The first generation was when someone could go see their physician and they would say ‘smoke a cigarette, it’ll make you feel better,’ and then the second generation was when there was more realization of the harm. The third generation brought regulation. Because [the tobacco industry] had a huge lobby and legislation takes a lot of time. As a country or state, you need to evaluate different aspects of regulations,” she said.

Facebook has been around now for 17 years and “let’s say the first eight years were paradise, we all thought ‘wow, what an amazing tool’ — and it is amazing — and over the past seven or so years, we’ve realized the harms and it takes time for regulators to react to this.”

In this file photo taken on April 10, 2018, one hundred cardboard cutouts criticizing Facebook founder and CEO Mark Zuckerberg stand outside the US Capitol in Washington, DC. (Saul Loeb/AFP)

To take the tobacco analogy forward, “we are now in the transition between the second and third generation” with regards to Facebook, she added.

This latest leak, she said, “will not effect the big change, but it is another layer in the bigger picture of shifting the attitude toward Facebook and the need for regulation.”

The bigger question is what do we want to regulate, asked Shwartz Altshuler.

“Do we want to regulate privacy, that is, not allow Facebook to target people [with ads]? Do we want to break up Facebook and force it to become a few different companies [by selling off Instagram and WhatsApp] like they did to Bell in the 1970s [splitting Bell System into entirely separate entities]? Do we want to address the algorithms [which control the ordering and presentation of posts and ads] and how will that be done? That’s the beauty of machine learning, it changes over time and evolves. So this talk of algorithmic transparency — we need to understand what that means, and who is going to decide these things,” she said.

Dr. Tehilla Shwartz Altshuler, a research fellow at the Israel Democracy Institute. (Israel Democracy Institute)

Perhaps the right regulatory path should be related to legal responsibility for content removal or exposure to harmful content, she proposed. There are so many questions to be asked and answered, and for regulations to become effective, she estimated that it would take another five to six years.

In the meantime, we need more digital literacy and education and other technological solutions, such as parental controls and startups that may help in tackling harmful content, she said.

To that end — and amid the uproar — Facebook announced on Monday that it will be introducing several features to protect children and teens, including urging them to take a break from Instagram, and “nudging” them if they are repeatedly looking at the same content that’s not conducive to their well-being. It also announced plans to introduce new optional controls so that parents or guardians can supervise what their teens are doing online.

In this June 4, 2012 file photo, an unidentified 11-year-old girl logs into Facebook on her iPhone at her home in Palo Alto, CA. (AP Photo/Paul Sakuma, File)

Facebook has said it has done its best to keep harmful content out of its platforms, and has appeared more open to regulation and oversight in recent days.

But for any meaningful change, said Shwartz Altshuler, we have to consider the powerful backing Facebook has.

“Facebook’s stocks weren’t harmed in any previous scandals, not in 2016, not when it was fined any of the times, and that means that the financial sector has always supported Facebook. It was an active partner in preferring money over ethics. So it’s very comfortable for us to point the finger at [Facebook founder] Mark Zuckerberg, but this is a whole industry that is conducting experiments on our democratic systems, on our souls, creating [chaos], and after that apologizing. And it is supported because Facebook brings in a lot of money [for shareholders].”

The markets reacted only when they realized that Facebook had not invested in creating layers of security, a failure that led to the outage, which the company blamed on a configuration change, she noted. “Think about it, it’s one person’s mistake that could cause this huge damage, and what if they are not putting enough money in securing the largest personal database in the history of humanity? What if this could happen again? This is what bothers the financial sector. It’s not the harmful influence Facebook has over society.”

“What all the fines and public scandals did not cause, the six-hour outage caused,” she added.

Shwartz Altshuler offered that Israel urgently needs legislation for social media platforms, including privacy laws that will make it illegal to target users based on their emotional vulnerabilities, requirements to explain decisions to remove content and to delete accounts, provisions making it easier to sue social media providers in Israel, the establishment of a framework via which Israeli courts can quickly issue orders for harmful content to be removed, and requirements to invest resources in monitoring and blocking toxic content in Hebrew.

“Though Facebook claims that it invests considerable resources outside the United States in attempts to block content and delete accounts, this is not evident, if to judge by what happens in practice with Hebrew-language content. There is no engine able to monitor such content, and the responsibility falls on a few content checkers who are expected to handle everything posted in this country on a daily basis. In Israel, we are extremely vulnerable to the influence of political polarization, as the events of the last year have clearly shown. This is why the demand for accountability is so critical in Israel as well,” Shwartz Altshuler wrote in an upcoming op-ed seen by The Times of Israel.

Such measures are reportedly already being considered. On Sunday, Israeli TV reported that a team of government-appointed Israeli experts was expected to examine far-reaching actions to rein in global social media companies and may seek to hold Facebook legally accountable for posts on its platform.

MK Yoaz Hendel at a meeting of the Arrangements Committee in the Knesset, on June 9, 2021. (Olivier Fitoussi/Flash90)

The team, currently being selected by Communications Minister Yoaz Hendel, could seek to compel Facebook to reveal its policies on censorship, banning, and how posts are placed in its algorithms. Currently, when content or users are removed from the platform, Facebook does not have to provide details explaining the move.

The proposed measures also include having social media giants become liable for incitement or libel posted on their platforms, which would be practically unprecedented worldwide, according to the report.

Facebook and other social media sites are currently not legally liable for untrue or harmful content that appears on their platforms, unlike newspapers and other traditional publishers.

Shwartz Altshuler said that the key is in “regulating the framework” and questioning the targeting practices.

“It’s not necessarily the content itself, it’s the question of what is going viral and why vulnerable people are being targeted with this content. So if we create a framework where we pay more regulatory attention to who they can target with specific messages, what data they can collect about people, and what they shouldn’t be allowed to collect, that could solve most of the problems,” she concluded.
