Julia Carrie Wong in San Francisco 

A year after Charlottesville, why can’t big tech delete white supremacists?

Facebook, YouTube, Twitter and more pledged to take action against hate groups. Why isn’t it working?
  
  

Many players in the US white supremacist movement have managed to return to major internet platforms. Photograph: Spencer Platt/Getty Images

The purge came too late for Heather Heyer.

Last August, amid the fallout from the deadly Unite the Right rally in Charlottesville, the major technology platforms – the ones on which the “alt-right” had arisen, recruited members and organized its violent event – decided to clean house.

Facebook and the video game chat application Discord moved to delete the groups and chatrooms affiliated with the violent white supremacist organizations present at Charlottesville, where 32-year-old Heyer was killed and dozens were injured when a car plowed into anti-racist counter-protesters. The neo-Nazi website the Daily Stormer was forced to move to the dark web after it lost its domain name registration and its ability to fend off DDoS (distributed denial-of-service) attacks.

The horse may have bolted, but the stables were looking decidedly cleaner.

But a year later, as the remnants of the alt-right prepare for an anniversary rally in Washington DC and – pending a court case – in Charlottesville, many of the players involved in the first Unite the Right event have managed to return to the major internet platforms – if they ever left in the first place. Facebook, YouTube and Twitter remain platforms for violent white supremacists to broadcast their messages.

“It feels ridiculous,” said Keegan Hankes, a senior research analyst at the Southern Poverty Law Center (SPLC). “I can’t believe we’re having this conversation about Facebook and their falling down on the job over and over again.”

While the violence at Charlottesville came as a shock to many, Facebook would be hard pressed to say that it hadn’t been warned that far-right extremism was festering on its platform.

Months before Charlottesville, the SPLC had provided Facebook with a spreadsheet of links to more than 200 Facebook pages and groups tied to American hate groups. The groups ran the gamut from the old-school hate of the Ku Klux Klan and neo-Nazi skinheads to the newer far-right groups flourishing among a generation of internet-savvy racists.

The vast majority of the SPLC-flagged pages remained live when the Guardian reviewed them in July 2017, though Facebook took down nine groups after being queried by the Guardian, including deleting pages linked to the neo-Confederate League of the South. Among the groups deemed by Facebook not to violate its community standards was the Traditionalist Worker party, a neo-Nazi group that had already been involved with violence in Sacramento.

After Unite the Right turned deadly, Facebook took action, deleting the rally’s event page and banning a number of white nationalist groups, including the Traditionalist Worker party.

But a year later, some of the people and groups involved in Charlottesville are back on the world’s largest social network. Until the Guardian contacted Facebook in late July, Jason Kessler, who organized Unite the Right, had a Facebook account and, according to chat logs obtained by the independent journalism outlet Unicorn Riot, had been using Facebook Messenger to coordinate Unite the Right 2 with fellow “pro-white” activists. His profile also featured a “donate” button. After a Guardian query, Kessler’s Facebook page was deleted.

The League of the South had re-established 13 Facebook pages for its various state chapters, which were removed only after the Guardian alerted Facebook to their presence. The pages did not use the name “League of the South” in their titles – perhaps to avoid detection by Facebook – but were identified in posts and on the group’s webpage.

Facebook also deleted an Instagram account linked to Patriot Front after the Guardian queried it. Patriot Front is a successor organization to Vanguard America, the neo-Nazi group with which James Alex Fields – the driver charged with killing Heyer – marched in Charlottesville.

“It doesn’t matter whether these groups are posting hateful messages or whether they’re sharing pictures of friends and family,” a Facebook spokesperson said in a statement. “As organized hate groups, they have no place on our platform.”

The spokesperson touted Facebook’s efforts to detect and combat hate, but did not explain why the company had not acted earlier on the Charlottesville-linked accounts. Facebook did not take action against accounts associated with David Duke, a former Ku Klux Klan leader and Charlottesville attendee, and Augustus Sol Invictus, a headline speaker at the event.

“They’re much more reactive than proactive,” said Hankes. “If you take just a few steps to halfway cover your tracks, you can avoid Facebook’s policies, or find yourself in just enough of a grey area that they won’t ban your group.”

Policing content on Facebook is undoubtedly difficult given the vast amount of material posted every day. But the company – like other US-based internet platforms – is also hamstrung by its apparent ambivalence over whether to proactively police known hate groups or to respect their right to free expression.

In an interview published on 18 July, Mark Zuckerberg defended a policy of allowing Holocaust denialism on the site, though he later walked back his statement. The company recently announced a policy to be more proactive about deleting misinformation that leads to violence. The policy is being piloted in Sri Lanka and will eventually apply globally, but Facebook said it is still figuring out how it will work.

“We’ve seen more than enough examples of people steeped in alt-right propaganda who go on to commit murder,” Hankes said. “They should recognize that and create policy and enforcement.”

While Facebook and Discord raced to clean house after the violence in Charlottesville, other platforms were less reactive. In November 2017, Twitter faced severe backlash when it verified Kessler’s account. The company subsequently announced new guidelines for account verification, removing blue checkmarks from Kessler, the white nationalist Richard Spencer, and other rightwing extremists.

The crackdown prompted many members of the alt-right to abandon Twitter for the alt-right-friendly Gab, though Kessler, Spencer, Duke and Invictus still have Twitter accounts, albeit unverified ones. (Gab was founded as a “free speech” alternative to Twitter after the rightwing provocateur Milo Yiannopoulos lost his verification there.) Several of the white nationalist organizations involved in Charlottesville also have Twitter accounts, including Identity Evropa, Patriot Front and the Rise Above Movement.

Controversially, Twitter has verified the accounts of the Proud Boys and their founder, Gavin McInnes. The Proud Boys are something of an alt-right edge case. Considered a hate group by the SPLC, they profess to be a fraternal organization promoting “western chauvinism”. Though the group publicly disavows racism and violence, its members have been involved in a number of violent altercations.

Twitter declined to comment, but pointed to a section of its policy which reads: “Exceptions will be considered for groups that have reformed or are currently engaging in a peaceful resolution process.”

YouTube, for its part, largely resisted public pressure after Charlottesville to ban a host of channels. Instead, the company said that it would continue to enforce the policies it already had on the books against extremism – policies that often involve restricting an account’s capabilities rather than banning it altogether.

It is perhaps unsurprising, then, that a substantial number of groups and individuals involved in Charlottesville have active YouTube channels: AltRight.com, Invictus, Bradley Dean Griffin, Christopher Cantwell, Duke, Identity Evropa, Kessler, the League of the South, Spencer, Spencer’s group the National Policy Institute, Patriot Front, the Rise Above Movement and the Traditionalist Worker party.

After the Guardian contacted YouTube this month, Cantwell’s account was “terminated due to multiple or severe violations of YouTube’s policy prohibiting hate speech”. Videos on the accounts of AltRight.com, Griffin, Duke, Kessler, Patriot Front, the Rise Above Movement and the Traditionalist Worker party were also stripped of some features, such as comments and share buttons, and placed behind an interstitial warning that the content could be “inappropriate or offensive to some audiences”.

“Hateful content that promotes violence has no place on YouTube,” a spokesperson said in a statement. “We remove videos violating these policies when flagged and we’re investing in machine learning to tackle these challenges more quickly than ever before.”

The one organization that was almost universally banned after Charlottesville was the neo-Nazi publication the Daily Stormer, which had its accounts deleted from Facebook, Twitter, YouTube and LinkedIn, was denied domain registration by GoDaddy and Google, and lost DDoS attack protection from the internet infrastructure company Cloudflare.

The site was forced onto the dark web for several months, but it has recently re-emerged on the surface web with a “.name” domain.

While the return of the hate site to the surface web may seem like a defeat, Hankes said that the actions taken against the Daily Stormer had been worthwhile, noting that the traffic and the number of commenters on the site have fallen “precipitously” since Charlottesville.

To Hankes, the fact that the Daily Stormer still exists but with a much smaller audience is a strong argument in favor of internet platforms denying hate groups access to their tools.

“We’re talking about the difference between a megaphone and the ability to speak aloud.”

 
