Alex Jones defamation trials show the limits of deplatforming for a select few – KPBS

A new defamation trial against conspiracy theorist Alex Jones, which began this week, may provide a glimpse into the effectiveness of “deplatforming” – booting unwanted accounts from social media sites.

This Connecticut trial is the second of three trials Jones faces for spreading lies on his streaming TV show and Infowars website that the 2012 Sandy Hook elementary school shooting was a hoax. The victims’ families, whom Jones called “crisis actors,” faced harassment, threats and psychological abuse. In August, a Texas jury awarded the family members $45.2 million in damages, although Jones says he intends to appeal the decision.

Jones, a serial conspiracy theorist and fabulist, was banned from almost every major internet and social media platform in 2018 after threatening then-Special Counsel Robert Mueller, who was investigating then-President Donald Trump’s ties to Russia. At first, a flurry of media coverage touted flagging traffic on Jones’ websites as proof that “deplatforming works.” However, the revelations from the Jones defamation trials may indicate the existence of a rare class of extreme internet personalities who are better protected from efforts to curb the reach of their content.

In the Connecticut trial, a representative of Jones’ company testified that Infowars may have generated between $100 million and $1 billion in revenue in the years since the Sandy Hook massacre. Testifying during the previous trial in Texas, Jones told the court that Infowars generated approximately $70 million in revenue last fiscal year, up from an estimated $53 million in 2018, the year Infowars was largely deplatformed.

The difference between Jones and many other right-wing actors who have been deplatformed, says political scientist Rebekah Tromble, who directs George Washington University’s Institute for Data, Democracy & Politics, is that he had an existing infrastructure outside of social media.

Infowars makes about 80% of its revenue from selling products, mostly dietary supplements, according to court filings from the largest of Jones’ nine private companies. He grew his talk radio audience, aided by an early partnership with a sympathetic distributor, and now owns his own network and independent video streaming site.

A growing body of research suggests that deplatforming toxic actors or online communities usually reduces viewership significantly, with the caveat that this smaller audience migrates to less regulated platforms where extremism, along with the potential for violence, is then concentrated.

Assessing the effectiveness of deplatforming is complicated, in part because the word itself can mean different things, says Megan Squire, a computer scientist who analyzes extremist online communities for the Southern Poverty Law Center.

“Your site infrastructure is gone, your social media, your banking. Those are the big three, I’d say,” Squire says, adding that each has had different effects depending on the case.

Squire’s research shows that traffic to Jones’ online Infowars store remained stable for about a year and a half after he was removed from major social media sites. It then declined throughout 2020, until the run-up to that year’s presidential election and its violent aftermath, when Infowars store traffic saw a massive surge, reaching levels Jones had not seen since the two years before his deplatforming.

Jones’ resilience is the exception rather than the rule, Squire says. She points to the case of Andrew Anglin, founder of the neo-Nazi website The Daily Stormer. After the violent 2017 Unite the Right rally in Charlottesville, Virginia, he lost his web domain and had to cycle through 14 more, losing traffic each time. Squire says Anglin is on the run from various court cases, including a ruling that he owes $14 million in damages for terrorizing a Jewish woman and her family.

Post-Deplatforming Survival Strategies

Even after social media bans, conspiracy theorists like Jones find workarounds. According to Squire, it’s common for other users to host the banned personality on their own channels or simply repost the banned person’s content. People can rebrand or direct their audience to an alternative platform. After bans from companies like YouTube and PayPal, white supremacist live streamer Nick Fuentes eventually built his own video streaming service, which he used to encourage his audience to kill lawmakers in the run-up to the January 6 Capitol riot.

Other internet communities have shown similar resilience. A popular pro-Trump news forum called TheDonald was banned from Reddit, later shut down by a subsequent owner after the Capitol riot, and is now more active than ever, according to Squire. When Trump himself was banned from Twitter, Squire watched the messaging app Telegram gain tens of thousands of new users. It remains a thriving online space for right-wing celebrities and hate groups.

As for fundraising, even if completely cut off from financial institutions that process credit cards or donations, extremists can always resort to cryptocurrency.

“100% of these guys are into crypto,” says Squire, though she notes it is not exactly easy to make a living off of: its value is volatile and not always easy to redeem. Still, Squire and her colleagues have found anonymous donors using crypto to funnel millions of dollars to Jones and Fuentes.

“We live in a capitalist society. And who says entrepreneurs can’t side with the conspiracy, too?” says Robert Goldberg, a history professor at the University of Utah. He points out that conspiracy theory peddlers have always been “incredibly adept” at whatever new technology was available to them.

“The Atlanta, Georgia Klan headquarters sold hoods and robes and all this merchandise, this brand, this jewelry, if you will, to the 5-6 million people who joined the Ku Klux Klan in the 1920s,” he says. But aside from the KKK’s heyday, Goldberg says, selling conspiracy material about the Kennedy assassination, UFOs, or the 9/11 terrorist attacks has generally been far less lucrative, until now.

Power and Lies

A bigger question for University of North Carolina Center for Information, Technology and Public Life researcher Shannon McGregor is what conspiracy entrepreneurs hope to achieve with their outreach.

“Why are these people even doing this? What’s in it for them? And in many cases, especially in this country, this moment is about staying in power,” says McGregor. Fringe communities always exist in democracies, she says, but what should be worrying is their proximity to power.

She rejects a “both sides” account of the issue, calling it a decades-old right-wing phenomenon. “Because this right-wing, ultra-conservative media ecosystem is aligned with political power in a way it hasn’t been since at least the Nixon era, it’s much less likely to actually go away,” says McGregor.

Deplatforming and civil defamation lawsuits, she argues, are less a solution than a form of harm reduction. When an individual conspiracist or conspiracy site loses its audience, replacements quickly spring up. None of this, however, McGregor and other experts agree, means that efforts to stem the spread of extremist or anti-democratic narratives should be abandoned altogether.

“I think overall [social media company] reps would prefer the conversation to be, ‘Oh, well, deplatforming doesn’t work, does it? … You know, that’s not our responsibility anymore,’” Tromble says.

Squire says there’s no doubt that anything that makes it difficult for toxic conspiracy theorists to operate smoothly or get their message across is worth doing. It makes the platform they are removed from safer and reinforces the social norm that harassment and hate speech have consequences.

Copyright 2022 NPR.
