The Coordinated Shadow Campaign Targeting Norway’s Election

As Norway approaches its parliamentary elections on September 8, 2025, Cyabra uncovered a coordinated disinformation campaign that systematically undermined Prime Minister Jonas Gahr Støre and sought to sway not only Norwegian voters but worldwide public opinion.

A Threat to Democracy Unfolds

While Norway was preparing for parliamentary elections, Cyabra analyzed online conversations and uncovered a sophisticated network of fake accounts manipulating public discourse and influencing voters’ opinions.

Cyabra’s discovery followed heightened concerns about foreign interference in Norway’s democratic processes: in May 2025, Norway’s domestic intelligence agency PST issued warnings about potential foreign meddling in the upcoming elections, with PST chief in Finnmark Johan Roaldsnes noting that Russian operatives have previously exploited election issues to create discord and spread false information.

Cyabra’s investigation, conducted between June 8 and August 3, 2025, revealed that out of all the profiles active in Norway’s political conversations, 57% focused specifically on PM Støre. More alarmingly, 34% of these profiles were identified as fake – a clear indication of coordinated inauthentic activity targeting the Prime Minister during this critical pre-election period.

The Numbers Behind Norway’s Coordinated Manipulation

Cyabra identified hundreds of fake profiles actively generating anti-Støre content within Norway’s digital sphere. These accounts produced hundreds of posts, shares and replies that reached hundreds of thousands of potential views and triggered hundreds of direct engagements.

At several key points in time, fake profiles actually surpassed authentic users in activity levels, driving the majority of content during peak engagement periods. This strategic amplification gave inauthentic voices outsized influence in shaping public perception of Prime Minister Støre.

On peak days, fake profiles (red) gained the upper hand on authentic accounts (green), driving the online discourse

Fake accounts deployed character defamation, often referring to Støre as a “clown,” “traitor,” or “weak leader” and using harsh language. Some replies included AI-generated offensive images and deepfakes showing Støre in humiliating contexts, designed to provoke ridicule and disgust.

A fake AI-generated image of PM Støre designed to ridicule him

These profiles also spread accusations targeting Støre’s foreign and domestic policies, claiming he prioritized Ukraine and U.S. interests over Norwegian citizens while ignoring key domestic issues like the cost of living and social welfare systems.

Coordinated Tactics: Manufacturing a Crisis of Confidence

Cyabra’s investigation revealed how fake profiles systematically infiltrated authentic conversations and engaged with real users to mimic genuine public opinion. By embedding themselves in organic threads, they amplified anti-government narratives, increased the perceived credibility of their messages, and blurred the line between coordinated disinformation and legitimate civic discourse.

These fake accounts were highly active on Jonas Gahr Støre’s official profile, using replies and comment threads to insert their narratives into high-visibility conversations, thereby boosting reach, legitimacy, and influence across broader audiences.

On the left: fake profiles successfully integrating into authentic conversations and communities. On the right: fake and authentic profiles that engaged with PM Støre’s official profile

This pattern aligns with what researchers have identified as evolving digital interference tactics. “Science Norway” recently reported that Russia, along with China and Iran, actively engages in election interference through disinformation, fake accounts, and manipulated news sources.

Global Reach, Local Impact

Many fake profiles claimed to be based in locations across Europe, North America, Asia, and Africa – creating an illusion of global consensus against Støre. This geographic dispersion obscured what may be a coordinated domestic effort originating from within Norway while creating the impression of widespread international criticism of Støre’s leadership.

Most importantly, this tactic successfully drew real users from various countries into the conversation as active participants, boosting the spread and credibility of anti-Støre narratives on a global scale.

The claimed locations of fake profiles (red) – and the authentic profiles (green) that were drawn into participating in the discourse following the fake campaign

Democratic Integrity at Stake

This coordinated campaign poses a direct threat to Norway’s democratic processes by potentially influencing voter behavior before the September 8 elections. The Norwegian government has implemented various measures against disinformation, including strengthening the Norwegian Media Authority’s responsibilities and opting for manual vote counting to mitigate risks.

However, these measures primarily address content-based disinformation rather than the behavioral patterns of inauthentic accounts. Traditional content moderation proves ineffective against these sophisticated operations because the coordinated bot network uses legitimate-appearing accounts to infiltrate authentic conversations.

Protecting the Authenticity of Election Discourse

The influence operation uncovered in Norway demonstrates more clearly than ever that government agencies need continuous monitoring focused on behavioral patterns, bot networks, and authenticity analysis, rather than content moderation alone. Behavioral analysis can identify coordinated campaigns before they significantly impact public discourse.

Protecting democratic processes requires several complementary efforts: continuous monitoring of the online discourse around key political figures and the manipulations targeting them; behavioral analysis to identify coordinated inauthentic activity; public awareness and education about how fake profiles operate; and international cooperation to address cross-border influence operations.

As Norway approaches this critical election, the shadow campaign targeting Prime Minister Støre represents an attack on the democratic process itself, one that governments can no longer disregard.

Read Cyabra’s full report uncovering influence operations in the Norwegian election
