After Jan. 6, Twitter Suspended 70,000 Right-Wing Accounts. Misinformation Dropped—Then Elon Musk Reversed It

By Staff Writer

In the days following the January 6, 2021, insurrection, Twitter suspended around 70,000 accounts tied to the right-wing QAnon movement for spreading dangerous misinformation that contributed to real-world violence. A new study finds that this mass suspension had an immediate and widespread effect on limiting the spread of false information on the platform, which is now known as X after being bought by Elon Musk.

The study, published in Nature, suggests that if social media companies want to cut down on misinformation, banning repeat offenders—the users who habitually spread fake news—might be more effective than just removing individual posts.


The suspension of those 70,000 QAnon-linked accounts didn’t just reduce the misinformation from those who were banned. It had a “spillover effect,” meaning that the spread of fake news dropped across the platform as a whole, including among users who hadn’t been banned but had followed those accounts. Some misinformation spreaders also voluntarily left the platform after the purge.

While social media moderation has become controversial in some circles, especially at X, where Musk has reinstated several previously banned accounts, including former President Donald Trump’s, this study shows that limiting misinformation is possible if platforms make it a priority—something that could be especially important with the 2024 election looming.

“There was a spillover effect,” said Kevin M. Esterling, a professor at the University of California at Riverside and a co-author of the study. “It wasn’t just a reduction from the deplatformed users themselves, but it reduced circulation on the platform as a whole.”


Along with the QAnon purge, Twitter also suspended Donald Trump on January 8, 2021, citing the risk of inciting further violence; other platforms, including Facebook and YouTube, followed suit. While Trump’s suspension may have reduced misinformation on its own, the overall effect of banning the QAnon-linked accounts still holds even when Trump’s account is removed from the analysis, according to co-author David Lazer, a professor at Northeastern University.

The study, which analyzed data from around 500,000 active Twitter users at the time, focused on about 44,000 users who had shared links to low-credibility websites, including sources like Gateway Pundit, Breitbart, and Judicial Watch. The results showed that users who followed accounts involved in the QAnon movement were less likely to share such links after the deplatforming compared to those who didn’t follow them.

Since Musk’s takeover, X has rolled out a “Community Notes” feature that allows users to fact-check posts instead of removing content or banning accounts outright. Musk has argued that this is a better way to handle misinformation, preferring to limit the reach of posts rather than taking them down.


The issue, as Esterling explains, is that “community notes are like putting your finger in a dike”—by the time one post is flagged, it may have already been seen by millions.

“I’m not advocating deplatforming, but it does have potential efficacy,” Lazer added. “Identifying people who are repeated sharers of misinformation is much easier than going after individual pieces of content.”

Anika Collier Navaroli, a former senior policy official at Twitter, said that the findings back up the argument she made to Twitter’s leadership at the time. She pointed out that the company had already identified the QAnon accounts before January 6, but only after the violence at the Capitol did the company take action. “We already knew who they were,” she said. “People just needed to die for the harm to be seen as real.”

The study makes it clear that banning habitual spreaders of misinformation can significantly reduce the spread of fake news on social media.
