Nepal’s Protests: Authentic Voices, Fake Calls for Violence

The September 2025 Nepal protests were sparked by the government’s sudden ban on popular social media platforms. Although the ban was quickly lifted, demonstrations had already erupted. Over the course of several days, the protests resulted in 72 deaths and 2,113 injuries among both citizens and law enforcement and ultimately led to Prime Minister K. P. Sharma Oli’s resignation.

For those following Nepali discourse in recent years, the movement was not unexpected; it reflected long-standing frustrations, with accusations of corruption, nepotism, and political favoritism fueling discontent. These grievances resonated strongly with Gen Z, who were at the forefront of the demonstrations and organized much of the movement online.

Beneath the surface, however, the protests revealed a more complex digital battlefield. Authentic voices demanding change were interwoven with coordinated inauthentic activity that disproportionately amplified violent narratives – a stark reminder of the massive impact disinformation has on public discourse, and how easily coordinated campaigns can escalate violence.

Here’s what Cyabra uncovered:

The Rise of an Online Protest

Calls for protest began spreading after the Nepali government announced a ban on social media platforms, ostensibly to combat fake accounts, disinformation, and misinformation – a move Nepali citizens did not receive well. On September 6–7, social media saw a surge of posts urging people to join the demonstrations, using hashtags such as #September8, #WakeUpNepal, and #GenZProtest. Many of these promotional posts featured AI-generated visuals of protesters, amplifying their appeal and reach. After September 8, when police opened fire on demonstrators near Parliament, leaving dozens dead in the first major clashes, the conversation shifted from mobilization to documenting the violence, with users sharing images, videos, and testimonies of the clashes.

AI-generated images and viral hashtags were widely used by authentic profiles.

Cyabra analyzed social discourse related to the protest across X, Facebook, and TikTok, and uncovered a network of fake profiles strategically positioned within authentic conversations to amplify violent messaging.

Unlike crude bot operations, these profiles, which represented 34% of the conversation on X, blended seamlessly with authentic voices, using identical hashtags and engaging with real profiles to maximize influence and perceived legitimacy.
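
As a rough illustration of how a figure like "34% of the conversation" can be derived – and how it differs from "34% of profiles" – here is a minimal, hypothetical sketch that assumes each profile has already been labeled authentic or fake by an upstream classifier. The data structure and field names are invented for illustration and do not reflect Cyabra's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """Illustrative record for one profile observed in the conversation."""
    handle: str
    is_authentic: bool   # assumed label from an upstream authenticity classifier
    post_count: int      # posts by this profile within the analyzed conversation

def share_of_profiles(profiles: list[Profile]) -> float:
    """Fraction of profiles in the conversation that are fake."""
    fake = sum(1 for p in profiles if not p.is_authentic)
    return fake / len(profiles) if profiles else 0.0

def share_of_conversation(profiles: list[Profile]) -> float:
    """Fraction of total post volume produced by fake profiles."""
    total = sum(p.post_count for p in profiles)
    fake = sum(p.post_count for p in profiles if not p.is_authentic)
    return fake / total if total else 0.0

# Toy data: one fake profile out of three, but it posts heavily.
sample = [
    Profile("gen_z_voice", True, 10),
    Profile("kathmandu_daily", True, 5),
    Profile("amplifier_01", False, 15),
]
print(f"{share_of_profiles(sample):.0%} of profiles are fake")        # -> 33%
print(f"{share_of_conversation(sample):.0%} of the posts are fake")   # -> 50%
```

The two metrics can diverge sharply, which is why a relatively small number of prolific fake accounts can claim an outsized slice of a conversation.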

Authentic Narratives – And Inauthentic Profiles

Cyabra identified three dominant narratives that shaped the protest discourse:

  • Youth-led democratic struggle: Posts framed young Nepalis as the driving force of the resistance, positioning Gen Z at the forefront of the democratic struggle. Content emphasized the moral contrast between youth (authentic and fearless) and the ruling elite (out of touch and repressive).
  • Police brutality and state repression: Discourse focused on security forces firing into crowds, deploying tear gas, and clashing violently with protesters. First-hand testimonies, media reports, and viral visuals portrayed the state responding to civic dissent with disproportionate force.
  • Political leadership crisis: Content highlighting government instability and demands for Prime Minister Oli’s resignation gained significant traction. Viral messages linked Oli and Parliament to eroding institutional trust, reinforcing the perception that new leadership was necessary.

A post by an authentic profile, driving the narrative of a youth-led democratic struggle.

These narratives merged and reinforced each other, creating a digital ecosystem of dissent that reflected and amplified street protests.

This was where fake profiles entered the discourse. Seizing on some of the most aggressive and violent narratives in the conversation, thousands of inauthentic accounts gained over 164,000 interactions from authentic profiles and reached over 326 million potential views – a disproportionate impact on the discourse.

What Makes Fake Profiles “Good Activists”

Fake profiles employed several tactics to maximize influence and exposure and to manipulate the conversation:

  • Strategic hashtag amplification: Inauthentic accounts used the same hashtags as authentic users (#NepalProtest, #GenZProtest, #SocialMediaBan, #EnoughIsEnough) to amplify reach.
  • Authentic engagement simulation: Fake profiles engaged with real users, commenting on authentic posts to appear organic while systematically amplifying violent narratives.
  • Promotional visual content: Posts encouraging participation featured AI-generated visuals of protesters, enhancing appeal and reach across platforms.
  • Cross-platform coordination: While X showed the highest concentration of fake profiles at 34%, coordinated activity spanned TikTok and Facebook as well, creating multi-platform amplification.
  • Selective amplification of violent discourse: Fake profiles consistently elevated some of the most radical and incendiary messages, ensuring that extreme narratives gained disproportionate visibility in the overall protest conversation.

These evolving tactics make coordinated inauthentic behavior increasingly complex to detect – which is exactly why fake profiles can exert such an outsized impact on public discourse.
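
For readers curious about what "detecting coordination" can look like in practice, below is a minimal, hypothetical sketch – not Cyabra's method – of one commonly discussed signal: pairs of accounts that repeatedly post the same hashtag within seconds of each other. All account names, timestamps, and thresholds are invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical posts: (account, hashtag, unix_timestamp)
posts = [
    ("acct_a", "#GenZProtest", 1000), ("acct_b", "#GenZProtest", 1030),
    ("acct_a", "#EnoughIsEnough", 2000), ("acct_b", "#EnoughIsEnough", 2015),
    ("acct_c", "#NepalProtest", 5000),
]

WINDOW_SECONDS = 60   # how close in time two posts must be to count as "co-posting"
MIN_CO_POSTS = 2      # how many co-posted hashtags before a pair looks suspicious

def coordinated_pairs(posts, window=WINDOW_SECONDS, min_co=MIN_CO_POSTS):
    """Return account pairs that co-posted the same hashtag within `window`
    seconds at least `min_co` times. A crude signal only: real communities
    also co-post, so this would be one feature among many, never a verdict."""
    by_hashtag = defaultdict(list)
    for account, hashtag, ts in posts:
        by_hashtag[hashtag].append((account, ts))

    pair_counts = defaultdict(int)
    for entries in by_hashtag.values():
        for (a1, t1), (a2, t2) in combinations(entries, 2):
            if a1 != a2 and abs(t1 - t2) <= window:
                pair_counts[tuple(sorted((a1, a2)))] += 1

    return {pair: n for pair, n in pair_counts.items() if n >= min_co}

print(coordinated_pairs(posts))  # -> {('acct_a', 'acct_b'): 2}
```

On its own, a heuristic like this produces many false positives – genuine activist communities also rally around the same hashtags at the same moments – which is part of why the profiles described above are so difficult to separate from authentic voices.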

Fake profiles (red) successfully infiltrated authentic conversations (green) to promote and push violent narratives.

Digital Activism vs. Manipulation

The Nepal protests illustrate social media’s dual nature in modern political movements. Platforms empowered authentic voices and facilitated unprecedented mobilization, contributing to a movement that ultimately led to Prime Minister Oli’s resignation. Simultaneously, coordinated inauthentic behavior (34% of profiles on X) demonstrated how vulnerable these conversations are to manipulation and radicalization.

As young activists now help shape Nepal’s new government, the digital battleground that fueled their rise continues to evolve. Detecting inauthentic behavior remains essential to understanding the true nature of online political discourse.

Download Cyabra’s full report on the Nepal protests
