
9% of YouTube users have viewed content from ‘extremist’ channels — survey

ADL study says white supremacist, racist content is ‘disturbingly accessible’; another 22% have seen at least 1 video from a channel that could be ‘a gateway to extremist content’

Illustrative: People wearing hats and patches indicating they are part of Oath Keepers attend a rally at Freedom Plaza Tuesday, Jan. 5, 2021, in Washington, in support of President Donald Trump. (AP Photo/Jacquelyn Martin)

A study published by the Anti-Defamation League on Friday found that despite widespread efforts to remove extremist content from YouTube, some nine percent of users say they have viewed content from a channel defined as “extremist.”

According to the survey, which collected data from 4,000 respondents to a YouGov America online panel, another 22% viewed at least one video from an “alternative” channel that could “serve as a gateway to extremist content.”

The report also found that although some high-profile extremist channels have been removed by YouTube, “white supremacist and other alternative and extremist content remained disturbingly accessible” on the platform.

“These findings further support the need for platforms to remove violent extremist groups and content, including conspiracy theories like QAnon that fueled the Jan. 6 assault on the US Capitol,” the report said.

ADL CEO Jonathan Greenblatt said upon publication of the report that it remained “far too easy for individuals interested in extremist content to find what they are looking for on YouTube over and over again.”

Jonathan Greenblatt, CEO and national director of the Anti-Defamation League, speaks on Capitol Hill in Washington, May 2, 2017. (Carolyn Kaster/AP)

“Tech platforms including YouTube must take further action to ensure that extremist content is scrubbed from their platforms, and if they do not, then they should be held accountable when their systems, built to engage users, actually amplify dangerous content that leads to violence,” he said.

In 2017, Google’s video-sharing platform took a tougher stance against supremacist content, limiting actions such as sharing, recommending and commenting on clips. In 2019, it announced that it would also remove material that denies the Holocaust or glorifies Nazism.

Major social platforms cracked down on the spread of misinformation and conspiracy theories in the leadup to the presidential election, and expanded their efforts in the wake of the January 6 Capitol riot.

YouTube banned QAnon in October and has been cracking down on accounts amplifying unfounded claims that Trump was fighting deep state enemies and cannibals operating a child-sex trafficking ring.

In this file photo, QAnon conspiracy theorists demonstrate against child trafficking on Hollywood Boulevard in Los Angeles, California, August 22, 2020. (Kyle Grillot/AFP)

The video platform also removed “Bannon’s War Room,” a channel run by Trump loyalist Steve Bannon, on January 8, after he spread false election claims and called for the beheading of Dr. Anthony Fauci, the top US infectious diseases expert.

YouTube additionally removed several well-known white supremacists from its platform last year, including Nick Fuentes, Richard Spencer and David Duke.
