By Phillip Anjorin
The use of Artificial Intelligence (AI) surged in 2025, fuelling sophisticated and misleading claims across Nigeria and the international media space. From fake prison footage of Nnamdi Kanu and fabricated diplomatic clashes between Tinubu and Trump to scams promising financial aid, deception was rife.
DUBAWA conducted a series of crucial fact-checks on political, security, and economic issues. These investigations revealed the extent of AI-fueled fabrication and underscored the need for media literacy in navigating the news and information we encounter daily on social media.
- Videos showing Nnamdi Kanu in prison
Nnamdi Kanu, the IPOB leader, was recently convicted and sentenced to life imprisonment. After his transfer to a Sokoto correctional centre, two videos went viral, allegedly showing him inside the prison. The first clip showed Kanu collecting lunch, while the second depicted him eating while incarcerated.
These videos generated immediate controversy and suspicion of manipulation. Through DUBAWA’s investigation, we confirmed the videos were false, as analyses revealed several visual inconsistencies. Kanu’s distinct facial mark was absent from the footage, while the videos also included distorted figures and an unrealistic serving motion with an empty spoon.
Multiple AI-detection tools confirmed that the clips were digitally altered, as analyses showed high percentages of AI-generated content and deepfake indicators, including face replacement.
- Ahmad Gumi urged the Federal Government to include bandits’ welfare in the national budget
A short Facebook video showed controversial cleric, Ahmad Gumi, advising the Federal Government to negotiate with bandits and allocate funds for their welfare in the national budget. The clip, shared by “Prime Entertainment,” generated significant public outrage, though some commenters suspected it was a fake.
Given the sensitive nature of the claim amid Nigeria’s security challenges, DUBAWA conducted a fact-check. Manual analysis of the video revealed several anomalies, including the cleric’s unnatural body movements and a noticeable difference between the video’s voice and Gumi’s authentic voice.
To confirm suspicions of digital manipulation, we employed AI detection tools, including TruthScan AI and Attestiv Video Deepfake Detector, which both flagged typical signs of AI-generated deepfakes.
We also found no credible media report stating that Gumi advocated for bandits’ welfare in the budget. The combined evidence from audio and visual analysis confirmed that the video is AI-generated.
- Video of Tinubu threatening Trump
Following Donald Trump’s order for the United States military to prepare for a possible war in Nigeria against Islamic extremists, a video surfaced on X. The clip allegedly showed Bola Tinubu dismissing Trump’s threat, followed by Trump insulting Tinubu’s competence.
Verification tools, such as Attestiv and Deepware, confirmed extensive lip and face replacement, flagging the viral video as highly suspicious and AI-manipulated.
Also, credible news outlets reported that Tinubu’s actual response was a promise to engage the world diplomatically to end terrorism, not a public dismissal of Trump. DUBAWA’s analyses revealed no evidence of the alleged confrontation.
- Trump, Traore trade words over Africa’s independence, relationship with the West
A TikTok audio clip supposedly featured a direct exchange between Donald Trump and Burkina Faso’s leader, Captain Ibrahim Traoré. The audio revealed Trump criticising Traoré’s anti-Western actions as “reckless,” while Traoré fiercely defended his expulsion of France, questioning America’s “billions” in African investment, given the continent’s persistent poverty.
The clip’s virality, driven by Traoré’s controversial pan-African stance, led to public confusion. However, DUBAWA’s verification confirmed the audio was not authentic.
Analysis with AI voice-detection tools, including AI Voice Detector and Hiya Deepfake Detector, indicated that the audio was a deepfake. The tools estimated only a one per cent chance that the audio was real, suggesting both voices were artificially created to mimic the two leaders.
Furthermore, extensive checks found no credible record of such an exchange. Also, the fact that Traoré does not speak English publicly made the purported audio more suspicious.
- Videos in local languages promoting fraudulent links on TikTok
Amid Nigeria’s economic challenges, multiple sponsored videos featuring animated characters speaking Yoruba and Hausa falsely promised money or easy loans if users clicked the “Apply Now” button. One clip even mimicked an advert for Opay, a popular Nigerian bank.
Desperate TikTok users were observed sharing sensitive account details to access the promised aid. Our verification using detection tools such as AIorNot and Deepfake-o-meter flagged the videos as inauthentic, and some were found to have been created with tools like VEO.
Clicking the links redirected users to misleading external websites that either requested personal information (posing a phishing risk) or promoted unrelated services, such as foreign work visas. The videos were designed to exploit economic hardship and defraud unsuspecting Nigerians.
- Video showing flyover collapse in Lafia, Nasarawa State
A video circulated on Facebook barely a week after Nigeria’s president, Bola Tinubu, commissioned a one-kilometre-long flyover in Lafia, Nasarawa State. The footage depicting a collapsed flyover was falsely claimed to be the newly commissioned structure, with some posts erroneously citing a cost of N10 billion.
The false claim, which gained traction among users criticising the state government, was met with scepticism by others who suspected an AI trick. Verification confirmed these suspicions: a reverse image search traced the viral footage to a five-second video shared online, whose original caption stated it was created using AI.
Although deepfake detection tools did not conclusively flag the video, manual analysis revealed significant inconsistencies typical of AI generation. For example, humans and vehicles were unnaturally shown moving towards the collapsing structure, and figures were observed awkwardly merging.
Findings also revealed that the Lafia flyover cost N16 billion, not N10 billion. A partial collapse of a different grade-separated junction occurred at Keffi on July 18, 2025, after an oversized truck rammed into it. DUBAWA’s findings confirm that the footage was a collage from an AI-generated video and that the recently commissioned Lafia flyover did not collapse.
