Generative AI is increasingly being misused, and a new report shows how it is being put to work in state-backed influence campaigns. The latest target appears to be Europe, where AI-generated voiceovers were used in fake news videos to sway public opinion on Ukraine.
According to Massachusetts-based threat intelligence company Recorded Future, commercial AI voice tech, including ElevenLabs’ products, was “very likely” involved in a Russian-led operation.
The campaign, dubbed “Operation Undercut,” aimed to weaken European support for Ukraine. Fake videos accused Ukrainian politicians of corruption and spread misleading narratives about military aid. One video even claimed that “American Abrams tanks” were useless against jammers, pushing the idea that sending high-tech tanks to Ukraine was pointless.
Recorded Future researchers submitted the videos to ElevenLabs’ AI Speech Classifier, which confirmed that the company’s technology had been used to generate the voiceovers. ElevenLabs, founded in 2022, did not respond to requests for comment. The videos were released in several European languages, including English, French, and Polish, and their AI-generated voiceovers spoke without any discernible foreign accent; by contrast, some of the campaign’s videos that used human voiceovers featured speakers with detectable Russian accents.
The operation was linked to Russia’s Social Design Agency, a group that U.S. authorities sanctioned earlier this year for creating fake news websites. Although the campaign’s overall impact was minimal, it highlights how readily generative AI can be enlisted to shape public opinion.
This isn’t the first time ElevenLabs has faced scrutiny. The company’s technology was also behind a robocall impersonating President Joe Biden that targeted New Hampshire voters ahead of the state’s January 2024 primary. In response, ElevenLabs introduced new safety features to prevent misuse.
ElevenLabs has quickly become a major player in the AI space, with rapid revenue growth and the potential to reach a $3 billion valuation. But its growing influence comes with increasing responsibility.