“Biden calls Trump ‘threat to the nation,’” posted Sputnik International, a Russian state media site, sharing a video of a recent Biden speech to more than 400,000 followers. “Trump gets shot the very next day … Coincidence?”
The wave of sensational posts painted the United States as a nation in decline and on the verge of civil war. Russian state media boosted accounts saying that the United States had devolved into a third-world country. Chinese state media shared cartoons labeling America a “violence exporter.” And Iranian accounts spread false claims that the gunman was affiliated with antifa, a loosely knit group of far-left activists that Trump and Republicans have previously blamed for violence.
The frenzied post-shooting news cycle was a gift to adversaries who have spent years developing a digital strategy to leverage crises for political gain. The lack of immediate information about the gunman, stark images of a bloodied former president in broad daylight and rampant homegrown conspiracy theories created an ideal environment for influence operations to exploit.
“Any domestic crisis can and will be picked up and exacerbated by state actors, who will try to turn it to their own ends,” said Renée DiResta, former research manager at the Stanford Internet Observatory and author of “Invisible Rulers: The People Who Turn Lies Into Reality.”
Foreign adversaries pounced on the opportunity to portray the United States as “a violent and unstable actor — at home and around the world,” said Graham Brookie, the Atlantic Council’s vice president of technology programs and strategy.
While some state accounts publicly stoked these narratives on X, researchers also observed activities in more private channels, with Brookie remarking Sunday that Kremlin proxies across the messaging service Telegram were “having a day.”
Russia has used state-controlled media to promote negative stories about the United States for decades, a method that accelerated with the growth of English-language outlets and social media. After the invasion of Ukraine, however, some platforms blocked or labeled RT and Sputnik.
In response, Russia has put more work into generating unlabeled propaganda, including regular and “verified” blue-check accounts on X, influencers on Telegram and other platforms, and communications through unaffiliated media. The deniability makes the messages more credible, even when they overlap with content published by state-funded outlets.
X did not immediately respond to a request for comment.
The widespread impact of online foreign influence in American elections was first felt in 2016, when Russia used social media to target conservatives with scare messages about immigrants, minorities and crime, while also posing as Black activists angry at police violence. Since then, China has adopted some of the same techniques, according to researchers and intelligence officials.
In April, Microsoft reported that Beijing was using fake accounts to push questions on controversial topics including drug abuse, immigration and racial tensions. The accounts — which posed as American voters — sometimes probed followers about their support for U.S. presidential candidates.
“We know that Russia has historically taken these events as an opportunity to spread conspiracy theories, and we assume they are still running operations that include impersonating Americans,” longtime information researcher and University of Washington professor Kate Starbird said Tuesday.
The spike in posts related to the shooting comes as foreign interference operations are exploding and becoming more difficult to track. A variety of foreign actors are engaging in the campaigns, while advances in artificial intelligence have made it easier for even small actors to translate their messages into English, craft sophisticated images and make bogus social media accounts seem genuine.
Russian and Chinese accounts have proliferated on X, posting on such hot-button political issues as the decay of American cities and the immigration crisis at the Texas border. Earlier this year, propaganda accounts promoting Chinese views multiplied in the run-up to Taiwan’s elections. And last week, U.S. and allied officials identified nearly 1,000 fake accounts on X that used artificial intelligence to spread pro-Russian propaganda.
Since Saturday’s shooting, Russian diplomatic accounts have been amplifying critical statements from Kremlin spokespeople on X and other social media, said Melanie Smith, a U.S. research director at the Institute for Strategic Dialogue. Chinese state media outlets have taken a more neutral tone, focusing on allegations that Secret Service failures led to the violence, she said.
The Global Times, a Chinese state media outlet, shared a cartoon early Sunday depicting a hammer labeled “political violence” falling on a map of the United States. “Looking to the future, if the US is unable to change the current situation of political polarization, political violence is likely to intensify,” the account tweeted.
Some foreign actors have brazenly accused their enemies of somehow orchestrating the attack on Trump. For example, Russian-affiliated accounts on X suggested without evidence that Ukraine or the U.S. defense industry may have been involved to prevent Trump from cutting off aid to the region and withdrawing lucrative military contracts.
“Trump may have become an obstacle to the arms industry with his ‘America First’ program,” one post in German read. “The industrial and military lobbies have always had very long arms.”
“Trump’s coming to power means the collapse of the arms race,” one in French said. “… So you can look for someone who benefits.”
The accounts are tracked by Antibot4Navalny, a Russian activist research group.
In an interview on the Russian state TV channel Soloviev Live that was promoted on Telegram, U.S. journalist John Varoli said, “Ukrainian special services might be behind this, on the orders of the White House,” according to a translation by anti-misinformation company NewsGuard.
Varoli further suggested without evidence that the suspected gunman was affiliated with antifa, as did Iranian state media. As of Wednesday, the FBI had been unable to establish a motive; investigators said Thomas Matthew Crooks, a 20-year-old nursing-home employee from suburban Pittsburgh, appeared to have acted alone.
Over the past two years, social media platforms have scaled back work against foreign misinformation and curtailed communication with the U.S. government about it. The FBI recently resumed some communications with the companies, The Post previously reported. The contacts resumed shortly before the U.S. Supreme Court threw out a challenge from conservatives, who sought to ban such contacts as impermissible government interference in protected free speech.
Platforms such as Meta have teams that identify and respond to covert foreign influence operations. But the company, along with X and YouTube, has weakened or eliminated policies and programs meant to fight political misinformation and limited access to tools that helped independent researchers root out such networks.
“I’m worried that we’ve lost a little bit of those windows into that activity due to changes in recent years,” Starbird said.
Meta did not immediately respond to a request for comment.
Those teams, which typically ramp up in the months immediately before an election, may not be prepared for a crisis such as the assassination attempt so early in the political cycle, said Brian Fishman, who previously led Facebook’s work against dangerous individuals and organizations and co-founded the trust and safety company Cinder.
“The danger here,” Fishman said, “is that the threat to our political process isn’t just coming on Election Day.”
Naomi Nix contributed to this report.