
In August 2024, during the #EndBadGovernance protests in Nigeria, a group of minors was arrested for waving Russian flags. The incident quickly caught fire online. Photos and videos of the young protesters spread across platforms, sparking debates about patriotism, foreign influence, and state overreach. Some saw the Russian flags as a symbol of protest against Western-backed leadership or a call to replace democratic rule with a military government. Nonetheless, one thing was clear: the internet gave the moment new life and layers of meaning.
This protest is a clear example of how significant civic actions, especially those involving young people, can be amplified online, reshaped by competing narratives, and used to push public opinion in vastly different directions. Even before official reports are released, social media often sets the tone for how such events are understood, debated, and remembered.
In 2016, the US presidential election and the Brexit referendum, both influenced by coordinated disinformation and foreign interference, brought fact-checking global recognition. Countries began adopting fact-checking models not just to counter false narratives, but to safeguard democratic processes, public health, and national security.
This global shift was driven by the growing awareness that no nation is immune to the destabilising effects of manipulated information, especially when foreign actors deploy it.
What once appeared to be a challenge confined within national borders has since evolved into something far more complex. As new dynamics continue to emerge, disinformation is becoming increasingly sophisticated, shaped largely by an evolving phenomenon known as Foreign Information Manipulation and Interference (FIMI).
What is FIMI?
Foreign Information Manipulation and Interference (FIMI) refers to deliberate efforts by foreign actors, often state-affiliated or state-backed, to influence public opinion, disrupt democratic processes, or undermine societal trust using deceptive or manipulative tactics.
Unlike traditional propaganda, FIMI campaigns are often covert, adaptive, and tailored to exploit existing societal divisions. They rely on disinformation, forgeries, deepfakes, fake grassroots movements, and coordinated amplification across platforms to create confusion, erode trust in institutions, and manipulate narratives in target countries.
What sets FIMI apart is not just the message’s content but also the strategic intent behind it. These campaigns are designed to interfere with internal affairs without direct confrontation, often using social media, messaging apps, and even local influencers to spread and legitimise misleading narratives.
A notable example occurred during the COVID-19 pandemic, when foreign-linked networks spread conspiracies in Africa claiming vaccines were a Western tool for population control. These narratives, often disguised as local voices, were strategically shared to deepen mistrust in public health systems and Western institutions. Such campaigns didn’t just circulate misinformation; they deliberately exploited historical grievances to weaken global cooperation and institutional trust.
The growing recognition of FIMI as a unique and escalating threat has now led to increased monitoring, reporting, and collaborative countermeasures, especially during elections or geopolitical tensions.
FIMI in the Nigerian context
FIMI is increasingly shaping Nigeria’s digital space and is posing serious risks to public trust, national cohesion, and democratic processes. Though often subtle, these campaigns are strategic, sustained, and deliberately designed to exploit societal divisions and weaken institutional credibility.
A recent DUBAWA investigation uncovered a coordinated network of X (formerly Twitter) accounts pushing pro-Russian propaganda and anti-democratic narratives in Nigeria. These accounts, many of which had been active since 2022, were found propagating messages that questioned the value of democracy and praised authoritarian governance, particularly in the context of Russia’s growing influence across Africa.
While they disguised themselves as organic Nigerian voices, their messaging patterns, timing, and alignment with broader geopolitical events pointed to foreign orchestration. This type of manipulation isn’t just about opinion. It seeks to distort public discourse ahead of elections and normalise authoritarianism in a country that is still strengthening its democratic institutions.
Another investigation revealed how Russian-linked disinformation networks exploited social media platforms like Telegram and TikTok to influence protests in northern Nigeria, resulting in the unusual display of Russian flags during #EndBadGovernance demonstrations.
A Telegram channel known as “Africa Initiative,” linked to pro-Russian propaganda efforts, pushed narratives framing the protests as anti-Western uprisings, falsely suggesting external manipulation by Western powers. These narratives were amplified by paid local influencers, including popular TikTokers @Sani Aliyu638, nasi1germany, directorkano, danib714, danmakaranta62, sadeeqaminu90, and hajjims_fashion_empire, who helped legitimise and spread the messaging. The coordinated campaign, a clear example of FIMI, not only misrepresented the intent of the protests but ultimately contributed to the arrest and incarceration of minors.
Other documented cases of FIMI in Nigeria further reveal how foreign actors and their local collaborators exploit community-level tensions and political uncertainty to advance their agendas. Amplified through social media, symbolic gestures such as the flag displays pointed to coordinated disinformation efforts designed to align local frustrations with foreign geopolitical interests.
Similarly, in the lead-up to the Imo State governorship election, disinformation actors circulated a doctored image of a soldier, falsely claimed to be from Saudi Arabia, to spark panic and unrest. The image was, in fact, of U.S.-trained forces in Iraq. This false narrative, pushed by figures like Simon Ekpa, was part of a broader FIMI effort to incite violence and sow distrust in electoral processes using manipulated visuals and nationalist rhetoric.
These are not isolated incidents; they show a growing pattern of foreign actors using Nigerian social media spaces as laboratories for influence operations. They spread anti-Western, anti-democracy, or ethno-religiously charged narratives by cloaking their origin and intent, often using fake local identities or hijacked hashtags.
The gaps and what needs to be done
Despite growing awareness of disinformation, Nigeria remains dangerously underprepared for the scale and sophistication of FIMI threats. Existing counter-efforts, though commendable, are largely fragmented, underfunded, and reactive. While fact-checking organisations like DUBAWA have uncovered and documented several coordinated campaigns, these revelations rarely lead to institutional action or policy change.
There is no formal framework for detecting, tracking, or responding to foreign interference campaigns. Security agencies and digital regulators operate in silos, and there’s a lack of coordinated intelligence-sharing or response mechanisms.
Many Nigerians, especially outside urban areas, lack the digital literacy needed to spot subtle manipulations, and influencers, knowingly or not, often become amplifiers of these foreign narratives.
More troubling is that there is no accountability system for local actors who serve as conduits for foreign propaganda. Whether motivated by financial incentives or ideological alignment, these collaborators are rarely confronted or regulated, leaving the digital space open to sustained foreign influence.
To counter FIMI effectively, Nigeria must move from reactive detection to systemic prevention. The country needs a national strategy focused on information integrity that explicitly addresses foreign influence. This strategy should strengthen coordination among intelligence units, digital regulators, civil society, and media actors.
Efforts must also go beyond elite circles. Media literacy programmes tailored for local communities, schools, and faith-based institutions are key to helping citizens spot and resist manipulative narratives.
Social media platforms like TikTok, Telegram, and X must also be held accountable for enabling foreign actors to exploit their systems.
Conclusion
As unfamiliar as the concept of FIMI may seem, it is steadily gaining ground and becoming entangled in the wider web of online misinformation campaigns. It is a convenient tool for malign actors, who use paid or ideologically aligned influencers to promote false narratives and cast doubt on democracy without directly pointing fingers at those orchestrating it. As this threat grows, fact-checkers must ramp up efforts to raise awareness of it and work harder to debunk its deceptive claims.