Germany’s 2025 election is under attack. More than 1,000 fake accounts are manipulating political discourse, inflating support for the far-right AfD, and distorting public perception. Cyabra’s latest analysis uncovers a sophisticated disinformation operation designed to shape the outcome of the upcoming vote.
Disinformation and coordinated bot campaigns have long been a feature of election cycles, and this one is no exception. This campaign, however, also represents a significant evolution in the tactics of state actors seeking to influence voters, drive online discourse toward extremism and polarization, and erode trust in public institutions.
Here’s what Cyabra uncovered:
TL;DR?
- Disinformation at Scale: 1,000+ fake profiles have been actively engaging in election discourse, promoting the far-right party AfD while discrediting the Greens and Chancellor Olaf Scholz.
- Artificial Support: Fake profiles engaged heavily in amplifying posts by AfD’s co-chairwoman, Alice Weidel. On a key viral post, 33% of interactions were fake, creating an illusion of mass support.
- Long-term Influence Campaign: Nearly half (47%) of the fake accounts uncovered have been active for over a year, indicating a well-orchestrated, long-running disinformation effort to manipulate German voters.
Who Do the Bots Vote For?
During January and February, Cyabra monitored social media discourse related to Germany’s elections, analyzing hashtags, keywords, communities, and the authenticity of the profiles participating in online conversations.
Cyabra’s analysis detected over 1,000 fake profiles artificially boosting support for the far-right party AfD (Alternative for Germany), infiltrating authentic social media conversations to spread hundreds of misleading posts, attack political opponents, and amplify pro-AfD narratives.
Fake pro-AfD narratives were pushed most heavily in conversations around the three major political parties (AfD, SPD, and Greens), where fake profiles created the illusion of widespread AfD support while drowning out real political debate.
The Intricate Tactics of Election Interference
Cyabra’s research uncovered that 47% of the fake profiles have been active for over a year, suggesting a well-orchestrated, long-term influence operation designed to manipulate German public perception. The rest of the fake profiles, created in the months preceding the elections, show the gradual rise of manipulation efforts as voting day draws near.
In the picture: the “age” of fake profiles in German election discourse (based on creation date)
Across X, fake accounts were focused on three separate disinformation campaigns:
- Alice Weidel and AfD – Alice Weidel, the co-chairwoman of AfD, drew heavy bot activity pushing positive, supportive messaging: 23% of engagements with her content originated from fake profiles. A trending post by Weidel, with a potential reach of 126 million views, was flooded with fake engagement – a third (33%) of all interactions were fake.
- The Greens – 15% of the accounts discussing Germany’s Green Party were fake. These fake profiles promoted negative, anti-Green narratives and amplified support for AfD.
- Olaf Scholz and SPD – in conversations related to Germany’s chancellor and his party, the SPD (Social Democratic Party), 14% of the profiles were fake. An analysis of recent posts by Scholz uncovered an even higher share – 22% of the profiles interacting with his content were fake – amplifying criticism while, once again, pushing support toward AfD.
While the three disinformation campaigns acted separately, each with its own network of profiles, they all shared the same goal: to promote support for AfD and discredit AfD’s opponents.
The narratives used by fake profiles morphed, adapting seamlessly to the conversations they joined. In Weidel’s trending posts, fake profiles praised her, pushing the message that Weidel and AfD offer hope for a better future. In Green Party conversations, fake profiles attacked the party’s ‘Brandmauer’ slogan (German for ‘firewall’, the stance against far-right extremism and the AfD), claiming that the Greens’ policies would destroy Germany’s future. Fake comments on Chancellor Scholz’s posts discredited his leadership and the SPD, amplifying criticism. All three networks of bots frequently mentioned AfD.
In the picture: Weidel’s posts, the network of fake profiles that integrated into authentic engagements (Green: real profiles. Red: fake profiles), and examples of fake profiles supporting Weidel in the comment section.
How Can We Protect the Integrity of Election Discourse?
The narratives used by fake profiles in Germany’s election discussions followed classic disinformation tactics, commonly seen in past European elections: framing certain policies as threats and pushing similar messages across multiple discussions – both to promote their candidate and to discredit the opposition.
However, the tactics used by fake profiles this time were particularly sophisticated: By latching onto trending leaders’ posts, blending into authentic conversations, and coordinating positive and negative disinformation efforts, bots created the illusion of widespread support for their candidate – while simultaneously manufacturing massive criticism of their opponents. In doing so, they amplified polarization and fueled fear and anxiety about the future.
“Cyabra’s findings are a wake-up call: social media is being weaponized to manipulate the German election. The scale and coordination of these disinformation campaigns reveal a deliberate effort to shape public perception, sway undecided voters, and push a specific political agenda. With AI-driven disinformation manufacturing political narratives at scale, Germany’s election is far from the last to be at risk. Democracies and government organizations must act now to combat the growing threat that disinformation poses to election integrity.”
Learn more about Cyabra’s OSINT capabilities in monitoring election discourse, uncovering fake campaigns, and identifying bot networks.
Want to see how Cyabra uncovers fake campaigns in real time? Download the full report.