Will Social Media Companies Ban the Great Replacement Theory?

The mass shooting in Buffalo, NY that claimed 10 lives Saturday was an event shaped by and for internet platforms, including message boards, streaming services and social media sites.

Now, as the predominantly black neighborhood targeted by alleged killer Payton Gendron mourns, whether these platforms allow their users to spread the racist “great replacement theory” that appears to have motivated him has become a matter of public safety.

In the past, major social media companies have cited a clear connection to real-world violence as the impetus for cracking down on certain categories of extremist speech. Facebook, which had long allowed Holocaust denial under the banner of freedom of expression, definitively banned such posts in 2020 in response to rising rates of anti-Semitic violence. It also banned the QAnon conspiracy movement for similar reasons, saying that even QAnon content that didn’t itself incite violence could still “be linked to various forms of real-world harm.”

In theory, the Buffalo massacre could mark a similar moment of truth for the great replacement theory, which claims white people are being “replaced” by non-white groups, and to which Gendron referred repeatedly in a 180-page manifesto posted online before the spree.

But it’s not clear if things will actually turn out that way, given the political pressure on social media companies and the adoption of similar rhetoric by some of the right-wing’s most prominent figures.

Representatives from Twitch, Facebook, and Twitter did not immediately respond to a request for comment on what specific strategies or rules they use to moderate great replacement theory content. A YouTube spokesman did not immediately comment.

On most major social platforms, hate speech aimed at a specific group, as well as associated threats of violence, would typically already constitute a violation of the Terms of Service, said Courtney Radsch, a fellow at UCLA’s Institute for Technology, Law and Policy. What the Buffalo shooting could do, she said, is give tech companies some leeway to more aggressively enforce those rules.

“I think if you see a connection to violence in the real world, and a connection that direct, that gives more cover for a crackdown,” Radsch said.

“However,” she said, “it’s going to be a very challenging situation, because so much of this speech is coming from the far right; they have this cover from Tucker Carlson and Fox News.”

A New York Times analysis of 1,150 episodes of Carlson’s Fox show “Tucker Carlson Tonight” identified racial replacement fear-mongering as a through line, including more than 400 episodes in which Carlson claimed that Democrats (and some Republicans) are attempting to use immigration policy to change America’s demographics.

While there is already a perception among some conservatives that social media companies are biased against right-wing content – a notion that research refutes – a crackdown on posts related to great replacement theory could land the platforms in politically tricky waters, Radsch said. “That might make it harder for those platforms to take action.”

Wendy Via, co-founder and president of the Global Project Against Hate and Extremism, said that Carlson, along with ideologically aligned politicians like JD Vance and Jim Jordan, is treated with kid gloves by social media platforms because such figures are powerful and well-connected. “They don’t get moderated like everyone else.”

“Great replacement content will spiral out of control because those who push it” get preferential treatment, Via said. “It can go through.”

It’s not a new problem.

Following the 2019 mass shooting in Christchurch, New Zealand, which targeted several mosques, Facebook “acted immediately” to strip major proponents of replacement theory from the platform, including the group Generation Identity, Via said. (When Facebook’s list of “dangerous people and organizations,” which cannot be praised on the platform, leaked last year, several European branches of Generation Identity were on it.)

But the problem, Via says, is that such efforts come at a slow drip and play out unevenly across different social networks.

“It takes these big things to get them to act,” she said, but even then, “they don’t go from zero to 100. They go from zero to 20 … They have to go from zero to 100, not halfway, but it takes people dying to make them move [even] gradually.

“But I believe they will move gradually [now].”

Oren Segal, vice president of the Anti-Defamation League’s Center on Extremism, was even less optimistic.

“I’m trying not to be a pessimist, but if history is any indication, I don’t know how successful they will be or how much effort many of these companies will put in,” Segal said, adding that similar cycles of corporate reform followed the Christchurch shooting, the 2019 El Paso shooting targeting Latinos and the 2017 white nationalist Unite the Right rally in Charlottesville, Virginia. In each, the great replacement theory played a central role.

“This is rinse and repeat,” Segal said. “Do the changes they make in response to tragedy ultimately have a lasting effect?”

Having influential figures like Carlson pushing the ideology behind this latest tragedy could discourage platform companies from fighting its proliferation, but it shouldn’t, Segal said.

“The fact that the ‘great replacement’ is becoming ubiquitous not only on some extremist fringes but also in our public discourse,” he said, “suggests that there is more of a reason for them to take a stand [by moderating it], not less.”
