It’s a headline-splashing cricket saga, with former player and current England and Wales Cricket Board director Andrew Strauss pitted against bad boy triple centurion Kevin Pietersen. So among those weighing in on social media on Strauss’s recent decision to deny Pietersen a chance to play for his country was, of course, Adolf Hitler.
Viewed over 70,000 times since its upload to YouTube on May 12, “Hitler finds out that Andrew Strauss won’t select Kevin Pietersen” is just one of thousands of Hitler parodies using “Downfall” (Der Untergang), the Oscar-nominated 2004 German war movie depicting the last days of the Nazi regime.
In this version, the clever subtitled-English text is filled with innocuous cricket jargon railing against the state of the sport today as it relates to Strauss’s leadership. There is no reference to World War II or Jews, nor any use of anti-Semitic tropes, in the almost four-minute clip, created with an easy-to-use website.
Often uploaded to the popular Hitler Rants Parodies YouTube channel — viewed by almost 56 million users in six years — the “Downfall” parodies have had a storied history since their 2006 genesis. YouTube originally blocked some versions under copyright infringement claims from the film’s production company, but the company flip-flopped and reportedly even began advertising on these pirated, subtitled videos.
But taking into account Strauss’s Jewish family background, and the clip’s dissemination on social media platforms through hashtags such as #HitlerWasRight — in an era seeing a marked uptick in anti-Semitic expression online — when does such satirical content cross the line into anti-Semitism?
According to Google and Facebook policy directors, only through users’ input will questionable material even be reviewed. Speaking in Israel at the biennial 5th Global Forum for Combating Antisemitism, Google’s Juniper Downs and Facebook’s Simon Milner joined a panel called “The Oldest Hatred in the Newest Vessels: Toward Solutions,” chaired by Ira Forman, the US special envoy to monitor and combat anti-Semitism. Paul Giannasi, head of the UK Ministry of Justice’s Cross Government Hate Crime Programme, and Prof. Raphael Cohen-Almagor rounded out the panel.
Although many governments have legislation against hate speech, there is no unified legislation, making an international product like the world wide web fertile ground for anti-Semitic or other racist individuals and groups to spread their screed.
Governments are becoming increasingly aware of the issue of online hate. In the wake of the jihadist terrorist attacks against journalists and Jews in Paris, The New York Times reported that French Prime Minister Manuel Valls announced last month his government is dedicating €100 million over the next three years to combat racism and anti-Semitism by launching a nationwide awareness campaign, instituting harsher punishments for racist acts, and increasing monitoring of online hate speech.
“The Internet has some of the best products of humanity, and some of its worst ones… The only way to combat hate speech is unity,” said Cohen-Almagor, calling for increased cooperation between governments, law enforcement, and anti-terrorism units, alongside companies and NGOs.
Google’s Downs said the multi-billion dollar corporation struggles to fight for a world without discrimination while maintaining free speech. As stated in its community guidelines, “This can be a delicate balancing act, but if the primary purpose is to attack a protected group [such as a religion, race, or ethnicity], the content crosses the line.”
For Google, ‘context is key’
There is a detailed flagging system, she said, in which content is reviewed 24/7 around the globe, with respect to both local law and the company’s international standards on hate speech. Reviewers are continuously trained and updated on new forms of hatred; a reviewer in India, for example, may be less familiar with the quenelle, the French quasi-Nazi salute. During a recent visit to a review center, Downs said she saw lists of racist jargon and symbols tacked up next to work stations.
Additionally, there are users whom Downs called “trusted flaggers” whose judgment is essential to the company and who may report videos in bulk.
For Google, said Downs, “context is key.” The company uses the acronym “EDSA” to help determine if a post should be blocked from one of its many platforms (YouTube is a Google product), asking whether it educates, documents, or is scientific or artistic.
Like Google’s Downs, Facebook policy director Milner emphasized the crucial role “the community” of users plays in reporting troubling content. With 1.44 billion people on the Facebook platform, Milner said the company sees itself as “the guardians of the community.”
He emphasized that a multilingual human staff performs expert review of flagged content, with swift “triage” for posts dealing with terrorism or incitement to violence.
‘Users of social media are the first line of defense against online hate’
Echoing statements made by the Google and Facebook representatives, a brochure called “How to Combat Online Antisemitism” that was distributed on seats at the Global Forum stated: “Users of social media are the first line of defense against online hate.”
However, wrote Dr. Andre Oboler, CEO of the Online Hate Prevention Institute, “Reporting online antisemitism can appear complicated. Those who have reported content often become dispirited as too often nothing appears to be done, or worse, they are told their report has been seen and rejected.”
‘We suck at dealing with abuse’
The Twitter platform, home to monikers such as @HITLERDIDNUTHIN, offers an anonymity that lets users disconnect from the social norms that ordinarily govern communication.
In the aftermath of an early February Guardian article by Lindy West depicting her battle with online hate, Twitter CEO Dick Costolo wrote in an internal memo, “We suck at dealing with abuse and trolls on the platform and we’ve sucked at it for years. It’s no secret and the rest of the world talks about it every day. We lose core user after core user by not addressing simple trolling issues that they face every day.” Costolo took personal responsibility for the West fiasco and vowed to step up efforts.
But in many situations the removal of hate speech gets more publicity than the initial act.
Alluding to what is commonly known as the Streisand Effect, the UK’s Giannasi spoke about the case of Jewish MP Luciana Berger, who was the target of an anti-Semitic Twitter campaign during last summer’s Israel-Gaza war.
“Being seen to take it down is important, but it may not help the problem, and may indeed exacerbate it,” he said. With Berger, it is likely her images with photoshopped swastikas on her forehead “would have been seen by single figures. In the end, they were seen by millions.”
Said Giannasi wryly, “The Internet: The cause of, and the solution to, all of our problems.”
Satire and celebrities bring results
The most effective solution, as proposed by Google’s Downs and Facebook’s Milner, is neither more legislation nor increased monitoring. Based on their companies’ experience and commissioned research, the best way to deal with hate speech is “counterspeech.”
‘Counterspeech is a promising strategy in doing hearts and minds work’
Counterspeech, explained Milner, is a way of combating promoters of hate speech through positive speech about the same targeted ethnicity or religion. In soon-to-be-published research commissioned by Facebook from UK think tank DEMOS, counterspeech had significantly more impact than hate speech, as measured by interactions and shares.
Clips with a constructive yet satirical tone, with credible speakers who have “been there” or are celebrities/peers, were shared more and viewed more.
Google’s Downs suggested using satire and comedy to combat racist views, adding that Google is hosting events to bring together successful YouTube producers with credible voices in affected communities. “Counterspeech is a promising strategy in doing hearts and minds work,” she said.
To illustrate her point, she played a moving Anti-Defamation League video, “Imagine a World Without Hate,” depicting what the world would be like without racist murders.
The video was greeted with applause. But, drawing audience laughs, a commercial began rolling soon after its conclusion.
And that commercial aspect is exactly why, according to Cohen-Almagor, companies such as Google and Facebook need to “balance social responsibility with profit responsibility… Having responsibility for content on your service is good for business,” he said.
Ranked by Forbes as the world’s third and tenth most powerful brands, respectively, Google and Facebook are clearly taking note.