Generative AI is being misused more often, and a new report shows how it's playing a role in state-backed influence campaigns. The latest target appears to be Europe, where AI-generated voiceovers were used in fake news videos to sway public opinion on Ukraine.
According to Massachusetts-based threat intelligence company Recorded Future, commercial AI voice tech, including ElevenLabs' products, was "very likely" involved in a Russian-led operation.
The campaign, dubbed "Operation Undercut," aimed to weaken European support for Ukraine. Fake videos accused Ukrainian politicians of corruption and spread misleading narratives about military aid. One video even claimed that "American Abrams tanks" were useless against jammers, pushing the idea that sending high-tech tanks to Ukraine was pointless.
Recorded Future researchers submitted the videos to ElevenLabs' AI Speech Classifier, which confirmed that the company's technology was used to generate the voiceovers. ElevenLabs, which was founded in 2022, did not respond to requests for comment. The videos were released in several European languages, including English, French, and Polish; the AI-generated voiceovers carried no foreign accent, unlike the videos with human voiceovers, which had detectable Russian accents.
The operation was linked to Russia's Social Design Agency, a group that U.S. authorities sanctioned earlier this year for creating fake news websites. Although the campaign's impact was minimal, it showcased the powerful role of AI in influencing public opinion.
This isn't the first time ElevenLabs has faced scrutiny. The company's tech was also behind a robocall impersonating President Joe Biden ahead of the 2024 U.S. primary elections. In response, ElevenLabs introduced new safety features to prevent misuse.
ElevenLabs has quickly become a major player in the AI space, with rapid revenue growth and the potential to reach a $3 billion valuation. But its growing influence comes with increasing responsibility.