Elon Musk’s Machine for Fascism: A One-Stop Shop for Disinformation and Violence
Just over a year ago, I described how Twitter had been used as a way to sow false claims in support of Trump in 2016 and 2020.
I described how, in 2016, trolls professionalized their efforts, with the early contribution of Daily Stormer webmaster Andrew “Weev” Auernheimer. I quoted testimony from Microchip, a key cooperating co-conspirator at Douglass Mackey’s trial, describing how he took inoffensive content stolen from John Podesta and turned it into a controversy that would undermine Hillary Clinton’s chances.
Q What was it about Podesta’s emails that you were sharing?
A That’s a good question.
So Podesta’s emails didn’t, in my opinion, have anything in particularly weird or strange about them, but my talent is to make things weird and strange so that there is a controversy. So I would take those emails and spin off other stories about the emails for the sole purpose of disparaging Hillary Clinton.
T[y]ing John Podesta to those emails, coming up with stories that had nothing to do with the emails but, you know, maybe had something to do with conspiracies of the day, and then his reputation would bleed over to Hillary Clinton, and then, because he was working for a campaign, Hillary Clinton would be disparaged.
Q So you’re essentially creating the appearance of some controversy or conspiracy associated with his emails and sharing that far and wide.
A That’s right.
Q Did you believe that what you were tweeting was true?
A No, and I didn’t care.
Q Did you fact-check any of it?
A No.
Q And so what was the ultimate purpose of that? What was your goal?
A To cause as much chaos as possible so that that would bleed over to Hillary Clinton and diminish her chance of winning.
After Trump won, the trolls turned immediately to replicating their efforts.
Microchip — a key part of professionalizing this effort — declared, “We are making history,” before he immediately started pitching the idea of flipping a European election (as far right trolls attempted with Emmanuel Macron’s race in 2017) and winning the 2020 election.
They did replicate the effort. That same post described how, in 2020, Trump’s role in the bullshit disinformation was overt.
Trump, his sons, and his top influencers were all among a list of the twenty most efficient disseminators of false claims about the election compiled by the Election Integrity Project after the fact.
While some of the false claims Trump and his supporters made were throttled in real time, almost none of them were taken down.
But the effort to throttle generally ended after the election, and Stop the Steal groups on Facebook proliferated in advance of January 6.
To this day, I’m not sure what would have happened had the social media companies not shut down Donald Trump.
And then, shortly thereafter, the idea was born for the richest man in the world to buy Twitter. Even his early discussions focused on eliminating the kind of moderation that served as a brake in 2020. During that process, someone suspected of being Stephen Miller started pitching Elon Musk on how to bring back the far right, including “the boss,” understood to be Trump.
Musk started dumping money into Miller’s xeno- and transphobic political efforts.
Once Musk did take over Xitter, NGOs run by far right operatives, Republicans in Congress, and useful idiots coordinated to undercut any kind of systematic moderation.
As I laid out last year, the end result seemed to leave us with the professionalization and reach of 2020 but without the moderation. Allies of Donald Trump made a concerted effort to ensure there was little to hold back a flood of false claims undermining democracy.
Meanwhile, the far right, including Elon, started using the Nazi bar that Elon cultivated to stoke right-wing violence here on my side of the pond, first with targeted anti-migrant actions in Ireland, then with the riots that started in Southport. I’ve been tracing those efforts for some time, but Rolling Stone put out a new report on it yesterday.
Throughout, the main forum where right-wing pundits and influencers stoked public anger was X. But a key driver of the unrest was the platform’s owner himself, Elon Musk. He would link the riots to mass immigration, at one point posting that “civil war” in the U.K. was inevitable. He trolled the newly elected British prime minister, Keir Starmer — whose Labour Party won power in July after 14 years of Conservative rule — for supposedly being biased against right-wing “protesters.” After Nigel Farage, the leader of radical-right party Reform U.K. and Trump ally, posted on X that, “Keir Starmer poses the biggest threat to free speech we’ve seen in our history,” Musk replied: “True.”
Anything Musk even slightly interacted with during the days of violence received a huge boost, due to the way he has reportedly tinkered with X’s algorithm and thanks to his 200 million followers, the largest following on X. “He’s the curator-in-chief — he’s the man with the Midas touch,” says Marc Owen Jones, an expert on far-right disinformation and associate professor at Northwestern University in Qatar. “He boosted accounts that were contributing to the narratives of disinformation and anti-Muslim hate speech that were fueling these riots.”
Elon Musk, the richest man in the world, one of Trump’s most gleeful supporters, someone with troubling links to both China and Russia, has set up a one-stop shop: joining false claims about the election with networks of fascists who’ll take to the streets.
With that in mind, I want to point to a number of reports on how disinformation has run rampant on Xitter.
The Center for Countering Digital Hate (one of the groups that Elon unsuccessfully sued) released a report showing that even where volunteers mark disinformation on Xitter, those Community Notes often never get shown to users.
Despite a dedicated group of X users producing accurate, well-sourced notes, a significant portion never reaches public view. In this report we found that 74% of accurate Community Notes on false or misleading claims about US elections never get shown to users. This allows misleading posts about voter fraud, election integrity, and political candidates to spread and be viewed millions of times. Posts without Community Notes promoting false narratives about US politics have garnered billions of views, outpacing the reach of their fact-checked counterparts by 13 times.
NBC described Elon’s personal role in magnifying false claims.
In three instances in the last month, Musk’s posts highlighting election misinformation have been viewed over 200 times more than fact-checking posts correcting those claims that have been published on X by government officials or accounts.
Musk frequently boosts false claims about voting in the U.S., and rarely, if ever, offers corrections when caught sharing them. False claims he has posted this month routinely receive tens of millions of views, by X’s metrics, while rebuttals from election officials usually receive only tens or hundreds of thousands.
Musk, who declared his full-throated support for Donald Trump’s presidential campaign in July, is facing at least 11 lawsuits and regulatory battles under the Biden administration related to his various companies.
And CNN described how efforts from election administrators to counter this flood of disinformation have been overwhelmed.
Elon Musk’s misinformation megaphone has created a “huge problem” for election officials in key battleground states who told CNN they’re struggling to combat the wave of falsehoods coming from the tech billionaire and spreading wildly on his X platform.
Election officials in pivotal battleground states including Pennsylvania, Michigan and Arizona have all tried – and largely failed – to fact-check Musk in real time. At least one has tried passing along personal notes asking he stop spreading baseless claims likely to mislead voters.
“I’ve had my friends hand-deliver stuff to him,” said Stephen Richer, a top election official in Arizona’s Maricopa County, a Republican who has faced violent threats for saying the 2020 election was secure.
“We’ve pulled out more stops than most people have available to try to put accurate information in front of (Musk),” Richer added. “It has been unsuccessful.”
Ever since former President Donald Trump and his allies trumpeted bogus claims of election fraud to try to overturn his loss to Joe Biden in 2020, debunking election misinformation has become akin to a second full-time job for election officials, alongside administering actual elections. But Musk – with his ownership of the X platform, prominent backing of Trump and penchant for spreading false claims – has presented a unique challenge.
“The bottom line is it’s really disappointing that someone with as many resources and as big of a platform as he clearly has would use those resources and allow that platform to be misused to spread misinformation,” Michigan Secretary of State Jocelyn Benson told CNN, “when he could help us restore and ensure people can have rightly placed faith in our election outcomes, whatever they may be.”
Finally, Wired explained how, last week, Elon’s PAC made it worse by setting up a group of 50,000 people stoking conspiracy theories.
For months, billionaire and X owner Elon Musk has used his platform to share election conspiracy theories that could undermine faith in the outcome of the 2024 election. Last week, the political action committee (PAC) Musk backs took it a step further, launching a group on X called the Election Integrity Community. The group has nearly 50,000 members and says that it is meant to be a place where users can “share potential incidents of voter fraud or irregularities you see while voting in the 2024 election.”
In practice, it is a cesspool of election conspiracy theories, alleging everything from unauthorized immigrants voting to misspelled candidate names on ballots. “It’s just an election denier jamboree,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, who authored a recent report on how social media facilitates political violence.
[snip]
Inside the group, multiple accounts shared a viral video of a person ripping up ballots, allegedly from Bucks County, Pennsylvania, which US intelligence agencies have said is fake. Another account shared a video from conspiracy theorist Alex Jones alleging that unauthorized immigrants were being bussed to polling locations to vote. One video shared multiple times, and also purportedly from Bucks County, shows a voter confronting a woman with a “voter protection” tag on a lanyard who tells the woman filming that she is there for “early vote monitoring” and asks not to be recorded. Text in the accompanying post says that there were “long lines and early cut offs” and alleges election interference. That post has been viewed more than 1 million times.
Some accounts merely retweet local news stories, or right-wing influencers like Laura Loomer and Jack Posobiec, rather than sharing their own personal experiences.
One account merely reshared a post from Sidney Powell, the disgraced lawyer who attempted to help Trump overturn the 2020 election, in which she says that voting machines in Wisconsin connect to the internet, and therefore could be tampered with. In actuality, voting machines are difficult to hack. Many of the accounts reference issues in swing states like Pennsylvania, Michigan, and Wisconsin.
This latter network includes all the same elements we saw behind the riots in the UK — Alex Jones, Trump’s fascist trolls, Russian spies (except Tommy Robinson, who just got jailed on contempt charges).
Now, in my piece last year, I suggested that Elon has diminished the effectiveness of this machine for fascism by driving so many people off of it.
The one thing that may save us is that this Machine for Fascism has destroyed Xitter’s core value to aspiring fascists: it has destroyed Xitter’s role as a public square, from which normal people might find valuable news. In the process, Elmo has destroyed Twitter’s key role in bridging from the far right to mainstream readers.
Maybe that’s true? Or maybe by driving off so many journalists Elon has only ensured that journalists have to go look to find this stuff — and to be utterly clear, this kind of journalism is some of the most important work being done right now.
But with successful test runs stoking far-right violence in Ireland and the UK, that may not matter. Effectively, Elon has made Xitter a massive version of Gab, a one-stop shop from which he can both sow disinformation and stoke violence.
On a near-daily basis, DOJ issues warnings that some of this is illegal: not the false claims about fraud, and not much of the violent rhetoric, but definitely the attempts to confuse voters about how or when to vote.
NBC describes that election officials are keeping records of the corrections they’ve issued, which would be useful in case of legal cases later. What we don’t know is whether DOJ is issuing notices of illegal speech to Xitter (they certainly did in 2020 — it’s one of the things Matt Taibbi wildly misrepresented), and if so, what they’re doing about it.
I am, as I have been for some time, gravely alarmed by all this. The US has far fewer protections against this kind of incitement than the UK or the EU. Much of this is not illegal.
Kamala Harris does have — and is using — one important tool against this. Her campaign has made a record number of contacts directly with voters. She is, effectively, sidestepping this wash of disinformation by using her massive network of volunteers to speak directly to people.
If that works, if Harris can keep doing what she seems to be doing in key swing states (though maybe not Nevada) and get more of her voters to the polls, then all this will come to a head in the aftermath, just as I suspect other things will come to a head in the transition period, assuming Harris wins. In a period when DOJ can and might act, the big question is whether American democracy can take action to shut down a machine that has been fine-tuned for years for this moment.
American law and years of effort to privilege Nazi speech have created the opportunity to build a machine for fascism. And I really don’t know how it’ll work out.
Update: Thus far there have been three known Russian disinformation attempts: a false claim of sexual abuse targeted at Tim Walz, a fake video showing votes in Bucks County being destroyed, and now a false claim that a Haitian migrant was voting illegally in Georgia.
This statement from Raffensperger, publicly asking Musk to take it down, may provide some kind of legal basis to take further steps. That’s the kind of thing that is needed to get this under control.
This in a way overlaps with Ed’s post, “Do you trust the media?” X (Twitter) is media, and was biased under Dorsey to the point where a (past?) contributor to EW got removed for not talking pretty, for lack of political correctness. Dorsey is now farting around with crypto.
He was not the gold standard. Musk is different, but the pendulum has swung, passing equilibrium at its fastest point.
“I am, as I have been for some time, gravely alarmed by all this. The US has far fewer protections against this kind of incitement than the UK or the EU. Much of this is not illegal.” Ding ding ding! Thank you for this post.
TFG is Musk’s useful idiot, not the other way around, imo.
I don’t think either of them are useful. Definitely idiots, however!
They are both useful idiots from Putin’s POV IMHO.
The problem is the cadence mismatch: possibly illegal overt actions appearing 4-6 months prior to election day; prosecutorial responses appearing 1-2 years after election day. MAGA strategy (possibly tutored by KGB-trained Putin, and translated into action by Cohn-tutored Trump) has effectively exploited this mismatch.
How to deal with this issue in a democracy with free speech protections?
The President could declare Xitter to be a threat to the nation. Is it too late now? Maybe. All we can do is hope that Harris can prevail, and I think this hope is reasonable.
Why not shut it down? There are plenty of outlets for free speech. There is no constitutional right to broadcast on Xitter or Facebook. Yes, there will be lawsuits and pushback. Maybe it’s a dice roll with the Supreme Court.
But the main point is that it changes the cadence mismatch. Now MAGA has to prove they have a right to Xitter. Xitter has to scramble to stay in business. Negotiations will follow. Maybe the FTC will step in. Maybe something like UK/Germany protocols will result.
This could be done at any time.
There are about 100 things wrong, legally, with your argument, starting with “This could be done at any time.” If you want to fight fascism, you need to talk about what is real, not what you think should be real.
I also think there are steps in place to address what you call a cadence mismatch.
I’m beginning to think someone could start a boutique law firm specializing in various kinds of aggressive anti-defamation work, with a side-order of injury claims against those who are currently pushing the idea of civil war with every tweet/post they make, and enjoy some very nice profits.
This assumes our country survives the next elections, of course. I hope someone is quietly bringing the National Guard to full readiness and making sure that local law enforcement has been brought into line with the current, federal anti-coup policies (whatever those might be, if they exist… *Sighs*)
It would be a firm like Marc Elias becoming the go-to lawyer for election fraud claims.
As for the National Guard, the local commanders can do all the prep work they want to get to full readiness, and local law enforcement can do the same, but until their governor (or in DC’s case, the president/Sec Def) activates the NG, all they can do is stand by and wait. See Jan 6 . . .
“…side-order of injury claims…” thousands of potential paper cuts
The firm that won Rudy Giuliani’s apartment for Ruby Freeman is doing much of that. They first tried this with Seth Rich’s family, but unsuccessfully.
Scary, very scary. How to combat the richest man in the world holding his thumb on the scale for Trump?
I did ask ChatGPT-4o for some answers, but my link was moderated away, so I suggest that others ask the same questions.
(I use ChatGPT-4o in my work, which occasionally requires solving technical and informational problems that I have never solved before, so I have developed a good relationship with it.)
Here are the questions that I asked:
“How much actual influence does X (twitter) have on elections?”
“How can misinformation on X be mitigated?”
“What if the owner of X wants to use misinformation to affect the outcome of the presidential election? Can he be stopped?”
ChatGPT did provide answers to these questions, some very interesting.
I had two weeks in Glacier NP this summer, and I stopped using the platform in mid-July but left my account up. I had decided that I wasn’t changing any minds, and it had lost its appeal as a news aggregator. I have since put my efforts into a more local focus and have used information gleaned from this site to call into Wyoming’s right-wing radio to point out to the host how he was manipulated into reporting on stories based on disinformation from foreign adversaries. Much of his audience brays for the host to hang up or not have me on, but he is adamant that hearing what the other side is thinking is useful to counter liberals’ arguments.
I am torn about this approach as some argue that getting the right worked up gets them out to vote, but I have continued to argue that disengaging from one group of my local neighbors since the rise of Limbaugh and Fox News has been a failure of the left.
Due to the checks and balances incorporated into our governance, I realize that I have to lose my Wyoming Democratic Senator, Representative, County Attorney, and Sheriff in order for things to get really dicey in Laramie. If those things do occur over the next 2 years, then all the protections holding the Christian nationalist state legislature at bay will dissolve.
So what is the answer? Engage on Twitter, leave it dormant or some other solution?
I might be the last person to think of this, but inviting the people who are already communicating with other voters directly to continue communicating after the election/January 6/January 20 seems like the most obvious thing in the world.
We beg “people who have a platform” to say things – why not keep a progressive platform in place? It’s not about winning presidential elections, or even about elections. It’s about making actual progress, every day.
I think Biden finally figured out that the media were not helpful – that’s why he froze them out when he announced he was dropping out of the race. One hopes Harris has absorbed that lesson!
I wonder what can be done on the state level, considering the level of importance that the actions of individual states will have in the event of a Trump victory. Could Vermont, for example, ban Twitter within its borders much in the way it geofences its ban on internet gambling?