Don't ban collusion conspiracy theorists

In the wake of special counsel Robert Mueller's report, there's a new meme on the right: Social media companies should ban the "conspiracy theorists" who constantly repeated that President Donald Trump's electoral victory was tainted by collusion with Russia. The argument isn't particularly persuasive, but it does point to a genuine problem.

Here's syndicated columnist Adriana Cohen: "For starters, Twitter, Facebook, Google and other Silicon Valley tech companies should remove all Russian collusion conspiracy theorists from their platforms." After all, the argument runs, Alex Jones and his Infowars were deplatformed. Why not (asks Cohen) treat those who spread the collusion story the same way?

"And what about Google?" she continues. "Will it continue to allow search results that yield now-debunked conspiracy theories surrounding Russian collusion and the Trump campaign? Or will they do the right thing and scrub misinformation and lies to stop the hoax from perpetuating?"

As George Will is wont to say: Well.

Cohen's is rather a clever notion, trying to hoist the left with its own admittedly shaky petard. Banning would be no small punishment, because the evidence suggests that deplatforming works--that is, those who are barred from social media, whatever their previous prominence, rapidly lose audience and influence.

My view is that social media companies shouldn't be in the business of banning anybody, but I don't happen to run one, so it's not my call. So let's take Cohen seriously. Even if she is being tongue-in-cheek, trying to make a point about online discrimination against right-of-center viewpoints, her suggestion deserves a reflective response. Three responses, in fact.

1. Let's get the history right. Facebook, which got the ball rolling, didn't actually ban Alex Jones for spreading fake news. He was banned instead for violating the company's vague rules governing hate speech. Whatever one might say about the Russia-collusion narrative, it can't be squeezed into this particular pigeonhole.

I am skeptical that a category called hate speech can or should be defined and singled out for special treatment. But the various social media companies that have adopted policies limiting it are privately owned and can do what they want. That's their right under the First Amendment. (I know that many of my friends on the left insist that corporations don't have free speech rights, but I emphatically disagree.)

2. One way to differentiate the Alex Joneses of the world from the many media figures who spread the Russia-collusion hypothesis is that Jones seems to have manufactured theories from whole cloth, while those who pushed the collusion tale sincerely believed it. Many doubtless believe it still. The distinction, then, would rest on whether those who propound or repeat conspiracy theories have reasonable grounds for their beliefs.

My friends on the right proclaim that many commentators have been so desperate to "get" Trump that they have persuaded themselves, on the basis of thin evidence, of whatever truths make him look bad. But even if this assertion is true (and I'm not endorsing it), the Russia-collusion theory cannot seriously be compared to (to take one example) Jones' claim that the Sandy Hook school shooting was actually a secret operation organized by the U.S. government.

I admit that this would be a dreadful rule. It's difficult to imagine the social media companies, even with their thick bureaucracies of content reviewers, fairly enforcing so subtle a distinction. This is particularly true today, when people of differing politics can hardly join argument because of sharp disagreements over which grounds are reasonable.

But we're not discussing here what the best rule should be; my point is only that the Russia-collusion hypothesis is different from lots of other conspiracy theories that get people banned.

3. In any case, criticism of a political candidate or a sitting president, even when arising from rumor and speculation, arguably must stand on a different footing than speech of other kinds. Here it is useful to be a bit Hayekian. Friedrich Hayek's case for free speech rested crucially on the notion that public debate constitutes a check on the government; allowing the government to regulate speech thus allows the government to control the debate about itself.

Yes, this thesis is controversial, but the question of whether political speech is entitled to a higher level of constitutional protection than other kinds of speech can be left for another day. For the moment, suffice it to say that a social media company might reasonably adopt Hayek's position.

Having said this, I should make clear that I adhere strongly to my stated position that social media companies should not be trying to discriminate between true and false news. And not only because we don't really know all that much about how false claims on social media affect people's views. The larger reason is that as the de facto gatekeepers of public debate, social media companies have an ethical responsibility to bend over backward to be neutral toward content. Any effort to do otherwise will inevitably be tainted by partisan bias.

I certainly would not want the government to adopt any sort of legislation enforcing that responsibility--for example, by regulating the platforms as public utilities. But I'm still quite certain that the responsibility is there.

Stephen L. Carter is a professor of law at Yale University and was a clerk to U.S. Supreme Court Justice Thurgood Marshall.

Editorial on 03/31/2019