As the death toll rises from shootings carried out by people who've espoused white nationalist and white supremacist ideas, some Democratic presidential candidates are calling for social media companies to more forcefully crack down on hateful content on their platforms.
Their goal is to make it harder for groups or individuals to coordinate their activities and circulate content targeting racial, ethnic and religious minority groups.
But the candidates' furthest-reaching proposals to stop extremists from weaponizing the Internet have also sparked constitutional concerns over how far the government can go in policing online speech.
Social media have long been a crucial tool for extremist groups to disseminate their rhetoric and recruit new supporters -- both on fringe right-wing platforms like 8chan and in dark corners of more mainstream networks.
The organizers of the white nationalist rally in Charlottesville, Va., in August 2017, which left one woman dead, planned their event in Facebook group chats as well as on the Discord chat app. And the man who killed 11 people at a Pittsburgh synagogue last October had a history of sharing content and interacting with high-profile white nationalists on the right-wing social media network Gab in the year before his attack, an analysis by the Southern Poverty Law Center found.
The El Paso, Texas, gunman cited the massacre at a mosque in Christchurch, New Zealand, in a manifesto he posted online. The Christchurch shooter, in turn, cited a 2011 mass murder in Norway carried out by an Islamophobic extremist -- a sign of how one attack can inspire others in a deadly cycle.
"White supremacists are using social media to connect and spread their hate and evil to others," said Farah Pandith, a former State Department official who focused on fighting violent extremism and has written a book on the subject. "The tech companies have been slow to act and limited in their scope -- we have to be realistic about the importance and seriousness of this threat."
The terms "white nationalists" and "white supremacists" are often used interchangeably. But white nationalists generally oppose what they call multiculturalism, while white supremacists subscribe to an ideology holding that white people are genetically and culturally superior.
Since Sept. 11, 2001, more people have been killed on U.S. soil by domestic right-wing terrorism than by jihadism, according to statistics compiled by the New America think tank. Experts say many of those attacks appear to be fueled by strands of the same racist ideology that white people are being "replaced" by members of minority groups or foreigners.
Former Rep. Beto O'Rourke of Texas, an El Paso native, would go furthest of the presidential candidates in rethinking legal protections for social networks.
Currently, social media companies are insulated from lawsuits about content posted by users, under Section 230 of the Communications Decency Act -- a provision that's been called "the 26 words that created the Internet."
O'Rourke's proposal would strip that legal immunity from large companies that don't set policies to block content that incites violence, intimidation, harassment, threats or defamation based on traits such as race, sex or religion. And all Internet companies could be held liable for knowingly promoting content that incites violence.
"This is a matter of life and death, and tech executives have a moral obligation to play an active role in banning online activities that incite violence and acts of domestic terrorism," O'Rourke spokeswoman Aleigha Cavalier said in an email.
Most experts believe the First Amendment allows private companies to block content on their platforms. But it's questionable whether the government can tell social media companies what speech they should block and what they should allow, said Jeff Kosseff, a cybersecurity law professor at the United States Naval Academy who wrote a book on legal protections in the digital age.
Kosseff said it's a legal issue that hasn't been tested in the courts, and the outcome would depend on the exact language of the law O'Rourke is proposing.
"There are certain types of speech that the government can regulate," such as imminent incitement of violence, or literal threats, he said. "But hate speech standing alone is really tricky."
Hate speech that isn't an imminent threat is still protected by the Constitution, noted Daphne Keller, a researcher at Stanford's Center for Internet and Society and a former associate general counsel for Google. "A law can't just ban it. And Congress can't just tell platforms to ban it, either -- that use of government power would still violate the First Amendment," she said.
Many of the mainstream social media giants have already voluntarily adopted terms of service that seek to block white nationalist content.
A Section on 09/03/2019