The far right is using AI to sell Hitler to a new generation

Extremists are using artificial intelligence to reanimate Adolf Hitler online for a new generation, recasting the Nazi German leader who orchestrated the Holocaust as a “misunderstood” figure whose antisemitic and anti-immigrant messages are freshly resonant in politics today.

In audio and video clips that have reached millions of viewers over the past month on TikTok, X, Instagram and YouTube, the führer’s AI-cloned voice quavers and crescendos as he delivers English-language versions of some of his most notorious addresses, including his 1939 Reichstag speech predicting the end of Jewish people in Europe. Some seeking to spread the practice of making Hitler videos have hosted online trainings.

The posts, which make use of cheap and popular AI voice-cloning tools, have drawn praise in comments on X and TikTok, such as “I miss you uncle A,” “He was a hero,” and “Maybe he is NOT the villain.” On Telegram and the “dark web,” extremists brag that the AI-manipulated speeches offer an engaging and effortless way to repackage Hitler’s ideas to radicalize young people.

“This type of content is disseminating redpills at lightning speed to massive audiences,” the American Futurist, a website identifying as fascist, posted on its public Telegram channel on Sept. 17, using a phrase that describes dramatically reshaping someone’s worldview. “In terms of propaganda it’s unmatched.”
The propaganda — documented in videos, chat forum messages and screen recordings of neo-Nazi activity shared exclusively with The Washington Post by the nonprofit Institute for Strategic Dialogue and the SITE Intelligence Group — is helping to fuel a resurgence in online interest in Hitler on the American right, experts say. In a report published Friday, ISD researchers found that content glorifying, excusing or translating Hitler’s speeches into English has racked up some 25 million views across X, TikTok and Instagram since Aug. 13.
The videos are gaining traction as former president Donald Trump and his Republican running mate, Sen. JD Vance of Ohio, have advanced conspiracy theories popular among online neo-Nazi communities, including baseless claims that Haitian immigrants in Ohio are eating pets.

Experts say the latest generation of AI tools, which can conjure lifelike pictures, voices and videos in seconds, allow fringe groups to breathe fresh life into abhorred ideologies, presenting opportunities for radicalization — and moderation challenges for social media companies.
“It enables a new kind of emotional engagement that may be much more seductive to a new generation,” said Emerson T. Brooking, resident senior fellow at the Atlantic Council’s Digital Forensic Research Lab.
One user hosted a livestream on the video-sharing site Odysee last year teaching people to use an AI voice cloning tool from ElevenLabs and video software to make Hitler videos. In roughly five minutes, he created an AI voice clone of Hitler appearing to deliver a speech in English, railing about Jews profiting from a capitalist system.

The user, who goes by the handle OMGITSFLOOD and is identified as a “prominent neo-Nazi content creator” by the SITE Intelligence Group, which tracks white supremacist and jihadist activity online, said on the livestream that Hitler is “one of the best f — king leaders that ever lived.” The user added that he hoped the videos would inspire a future leader like Hitler, someone who may be “voting for Trump” but “just hasn’t been pilled.”
Artemis Seaford, the vice president of AI safety at ElevenLabs, said the company prohibits the use of its tools to create violent, hateful or harassing content. After The Post flagged the livestream video to ElevenLabs, Seaford said the company banned the user from its platform. (ElevenLabs’ AI voice tool was used by a Democratic operative earlier this year to spoof President Joe Biden’s voice.)
Odysee and OMGITSFLOOD did not return requests for comment.

Creating the video required only a sample of Hitler’s speech, just a few seconds long, taken from YouTube. Without AI, such spoofing would have demanded advanced programming skills. Some misinformation and hate speech experts say the ease of these AI tools is turbocharging the spread of antisemitic content online.
“Now it’s so much easier to pump this stuff out,” said Abbie Richards, a misinformation researcher at the left-leaning nonprofit watchdog Media Matters for America. “The more that you’re posting, the more likely the chances you have for this to reach way more eyes than it ever would.”
Neo-Nazis are always looking for ways to “force” their narratives into mainstream discourse, said Rita Katz, the executive director of SITE Intelligence Group, who has detailed how extremists use AI in a report. AI helps by creating messages that capture people’s attention, she added.

“These disguised Hitler AI videos ... grab users with a bit of curiosity and then get them to listen to a genocidal monster,” Katz said.
On TikTok, X and Instagram, the AI-generated speeches of Hitler don’t often bear hallmarks of Nazi propaganda. A video posted on TikTok in September depicted a silhouette of a man who seemed to resemble Hitler, with the words: “Just listen.”
Over a slow instrumental beat, an AI-generated voice of Hitler speaks English in his hallmark cadence, reciting excerpts of his 1942 speech commemorating the Beer Hall Putsch, a failed Nazi coup in 1923 that vaulted Hitler to prominence. The video, which is no longer online, got more than 1 million views and 120,000 likes, according to Media Matters for America.

The AI Hitler speeches can be effective as tools of radicalization even when they aren’t accompanied by explicit pro-Hitler commentary, said Isabelle Frances-Wright, director of technology and society at ISD. “There’s a big difference between reading a German translation of Hitler speeches versus hearing him say it in a very emotive way in English,” she said.

Frances-Wright compared them with videos that went viral on TikTok last year in which content creators read excerpts of Osama bin Laden’s “Letter to America” manifesto, drawing replies from young Americans such as, “OMG, were we the baddies?”
On TikTok, users can easily share and build on the videos using the app’s “duet” features, which allow people to post the original video alongside video of themselves reacting to it, Richards said. Because the videos contain no overt terrorist or extremist logos, they are “extremely difficult” for tech companies to police, Katz added.

Tracy Clayton, a Meta spokesperson, said the company removed the Instagram videos after being alerted by The Post because they violated the company’s content policies. Nick Smith, a TikTok spokesperson, said it removed all the videos The Post flagged to the company because they violated TikTok’s community guidelines. Jack Malon, a spokesperson at YouTube, said the site’s community guidelines “prohibit content that glorifies hateful ideologies such as Nazism, and we removed content flagged to us by The Washington Post.”

ISD’s report noted that pro-Hitler content in its dataset reached the largest audiences on X, where it was also most likely to be recommended via the site’s algorithm. X did not return a request for comment.
The number of active neo-Nazi groups in America has declined since 2017, according to annual reports by the nonprofit Southern Poverty Law Center, partly as a result of crackdowns by law enforcement following that year’s deadly “Unite the Right” rally in Charlottesville.

But that doesn’t mean Nazism is on the decline, said Hannah Gais, a senior research analyst at the center. Right-wing extremists are turning to online forums, rather than official groups, to organize and generate content, using mainstream social media platforms to reach a wider audience and recruit new adherents.
While it’s impossible to quantify the real-world impact of far-right online propaganda, Gais said, you can see evidence of its influence when prominent figures such as conservative pundit Tucker Carlson, billionaire Elon Musk and Trump adviser Stephen Miller espouse elements of the antisemitic “great replacement” conspiracy theory, or when mass shooters in Buffalo, El Paso and Christchurch, New Zealand, cite it as inspiration.

Those high-profile news events, in turn, reinvigorate online interest in far-right ideologies. ISD’s report found that posts glorifying or defending Hitler surged on X this month after Carlson posted an interview with Holocaust revisionist Darryl Cooper, which Musk reposted and called “worth watching.” (Musk later deleted his post.)
Carlson, Musk and Miller did not return requests for comment.
Earlier this year, the pro-Trump conspiracy theorist Dominick McGee posted to X an English-language AI audio recreation of Hitler’s 1939 Reichstag speech, which garnered 13,000 retweets, 56,000 likes and more than 10 million views, according to X’s metrics. (Experts note that view counts on X may be inflated compared with those on other sites.)
“This AI translation gives viewers a look into the mind of Hitler,” wrote McGee, whose X account Musk controversially reinstated last year after he was suspended for posting a video of child sex abuse. While McGee’s post didn’t endorse Hitler’s views, many of the replies indicated approval, while others requested translations of the speech into French, Arabic and Bengali.
Jared Holt, a senior researcher at ISD who co-wrote a separate recent report on the rise of white nationalism on TikTok, said extremists are often among the first groups to exploit emerging technologies, which often allow them to maneuver around barriers blocking such materials on established platforms.
Holt said you can’t assume that everyone who watches or shares an AI Hitler video or meme agrees with it — some might just find it funny or edgy.
“But in the broader scheme of politics, it can have a desensitizing or normalizing effect if people are encountering this content over and over again,” he said.