Refining its past approach toward undermining U.S. elections, Russia is trying to disrupt the current one by playing on divisive American narratives to gain traction with powerful influencers.
On Friday, top U.S. intelligence officials said Russians were behind the latest in a slew of faked propaganda videos, this one featuring purported Haitians boasting that they were voting multiple times in Georgia. The day-old video had been pushed on X by Amy Kremer, a member of the Republican National Committee and co-founder of Women for Trump.
The video played to a disinformation narrative championed by Donald Trump and other conservatives about Haitian immigrants in Ohio, aiming it at voters in a nearly must-win state for the Republican nominee.
It was also more evidence of a marriage of convenience between Moscow’s narratives of U.S. dysfunction and Americans who share those narratives, or who appreciate the enhanced audience engagement, and in some cases money, that Russia can arrange.
That alliance has made Moscow’s effort the most serious foreign influence operation of the 2024 campaign, officials said, one they expect to continue by casting doubt on the election’s integrity, encouraging protests and sowing disorder in the weeks after Nov. 5, especially if Democrat Kamala Harris wins.
In what may have been a warm-up exercise for Election Day, Russians produced a viral faked video portraying someone ripping up Trump ballots in Pennsylvania, the U.S. Office of the Director of National Intelligence said late last month, describing it as part of Russia’s effort “to undermine trust in the integrity of the election and divide Americans.”
Russia’s amplifying role in promoting the rumors and lies about Haitian immigrants eating pets in Springfield, Ohio, offers another example of how the strategy works. The bogus assertion was fading by early October, a month after Donald Trump aired it in the presidential debate, when influencer Tayler Hansen posted one more video with phony proof.
Hansen gained influencer status while working for Tennessee-based Tenet Media until early September, when the Justice Department alleged in a federal indictment that the outfit had received nearly $10 million from Russian state-owned media. Since Oct. 7, he has racked up 8 million views on X alone for grainy footage he said showed a Black person “chopping an animal apart” with a machete behind a housing complex.
Tim Pool, a more widely followed influencer who had also worked for Tenet, reposted the video with his own comments on Facebook. The attention swelled even after the beleaguered Springfield police found the person who shot the video, who told them he had filmed a resident chopping up a “big-ass ham,” a city spokesperson told The Washington Post. The X post still has no notes appended warning that it is untrue.
Hansen and Pool say they didn’t know Russia was behind Tenet Media, which paid them and four others millions of dollars collectively before the scheme was exposed with the Sept. 4 indictment against two executives at RT, once called Russia Today. The bust, in which no Americans were charged, did nothing to prevent the influencers from continuing to advance Russia’s interests by fleshing out Trump falsehoods about Springfield and other prime subjects of misinformation. Pool and Hansen did not respond to Post questions.
“In 2016, they had to hide that this was Russian disinformation,” said Alex Stamos, a cybersecurity expert who as Facebook’s then chief security officer uncovered a web of fake Russian accounts on the social media site. “Now, the Department of Justice can present evidence that influencers were paid huge amounts and spread disinformation, and they still have a massive audience.”
The latest video that surfaced in Georgia bore the hallmarks of a Russia-linked troll farm aided by John Mark Dougan, a former Florida and Maine law enforcement officer who lives in Russia, said Darren Linvill, co-director of Clemson University’s Media Forensics Hub. The same group was previously blamed for a video that falsely claimed to show a former student of Democratic vice-presidential nominee Tim Walz.
“There is a large audience right now for stories attacking the integrity of the election,” Linvill said. “Certain people want to hear this story because it reinforces their existing beliefs. It is also, perhaps, just the right mix of cynicism and xenophobia to be popular in certain corners of the internet.”
Gabriel Sterling, Georgia’s top elections official, condemned the dissemination of the video. “No responsible person would retweet this ridiculously obvious lie and disinformation,” he said.
Asked to comment on election influence, the Russian Embassy cited a previous statement by a spokesperson for the Ministry of Foreign Affairs: “We have never interfered, we are not interfering, and we do not intend to interfere in the future – unlike the United States, which cannot resist meddling in other countries’ internal affairs.”
As Tenet’s operations continue to reverberate, officials have suggested it was hardly the only Russia-backed operation. The Justice Department has offered rewards of up to $10 million for more information about executives of Russia-based Rybar LLC, which it has accused of running campaigns to stoke hate and influence the U.S. election under social media hashtags #StandWithTexas and #HoldtheLine.
Meanwhile, Russian trolls and automated accounts aided by artificial intelligence have amplified falsehoods about hidden hurricane death tolls and hammered on about the prospect of illegal voting by immigrants, which numerous studies have found to be a rare occurrence.
Like the deadly storms, Tuesday’s election is a huge event that unspools over time, offering a perfect stage for disinformation that claims new facts and confirms deep fears.
Iran has also improved its tactics, using fake news sites aimed at the left and right, encouraging street protests and hacking the Trump campaign, U.S. officials say. China is using more artificial intelligence and meddling in multiple congressional races involving its strong critics.
But Russia has developed the deepest understanding of America’s favorite online platforms and how to manipulate them, as the Tenet case suggests, grounded in part in common views. “There’s an overlap between some conservative influencers who want to paint a picture of a country in moral disarray and the same interest that a Russian troll might have,” said intelligence analyst Rennie Westcott, who tracks disinformation for Blackbird.AI.
RT used known right-wing influencers such as Pool, allegedly directing them to use their personal accounts to plug Tenet videos to millions of followers, while also funding relative unknowns like Hansen, according to the indictment. The six influencers in Tenet’s stable had a collective billion video views in the year before the indictment was filed in September. All six still have active accounts on the major platforms.
Russians directed the editing and promotion of Tenet videos, the indictment alleges, emphasizing Ukraine as the villain, a line that benefits Russia no matter who wins on Tuesday. The Kremlin talking point that Ukraine is a corrupt drain on U.S. resources is shared by enough leading Republicans that Trump might be able to cut all aid if elected, Stamos said.
“They are spending tens of millions of dollars to help them win a war where they wasted billions,” he said, citing the war in Ukraine as a reason the Kremlin now has multiple agencies under its direct control devoted to pushing propaganda on Americans.
That is a big leap from 2016, when the Russians made a first major foray into fake social media accounts, mainly to undercut Democratic candidate Hillary Clinton, and “got lucky” when the GOP nomination and victory went to Trump, whom they favored in the election, said Emerson Brooking, strategy director at the Atlantic Council’s Digital Forensic Research Lab. Russians worked harder at social media disinformation four years later, fielding different teams for different messages and platforms and producing more credible fake documents and spokespeople.
During this year’s campaign, Brooking said, Russia’s focus is on “getting into the U.S. ecosystem and identifying sympathetic messengers.”
While Russia leverages “a wide range of influence actors,” a senior U.S. intelligence official said in a background briefing for reporters this month, it prefers homegrown advocates. “They use witting and unwitting Americans,” the official said, because “Americans are more likely to believe other Americans.”
Documents obtained by a European intelligence agency and shared with The Post showed that Dougan has become one of the Kremlin’s most effective propagandists, working directly with Russian military intelligence to pump out deepfakes targeting the Harris campaign and circulating misinformation, The Post reported last week. Dougan told The Post he had no ties to the country’s government.
Onetime Atlanta broadcast journalist Ben Swann has earned millions of dollars from RT’s parent company to produce shows, and his April video series “Zelenskyy Unmasked” was promoted by U.S. influencers including Donald Trump Jr., according to the Associated Press. Swann did not respond to a request for comment.
Internal Kremlin documents, exclusively reported by The Post earlier this year, showed high-ranking officials identifying core messages to spread: Ukrainian President Volodymyr Zelensky is corrupt; record numbers of immigrants are undermining America in fundamental ways; “white Americans” suffer because of foreign aid. Some of those pitches were in tune with ideas already being spread by the American right, and the documents said the Russians would do well to amplify the most viral and explosive of them.
The Justice Department also seized web domains last month that it said a Russian government contractor called the Social Design Agency used to post fake news stories mocked up to look like they were from The Post, Fox News and other outlets. An accompanying affidavit said the contractor tracked what 500 or more U.S. influencers were saying on social media, both to see how its campaigns were faring and for possible targeting through intermediary accounts.
Elon Musk, the owner of X and its most widely followed person, amplified Tenet influencer accounts repeatedly, reposting or replying more than 70 times in a year, most commonly to Pool.
“They were looking at the behavior of people like Musk and figuring out how to position messages to get in front of him, to get him to amplify,” Brooking said.
Catherine Belton and Samuel Oakford contributed to this report.