It started with a phone call that should have been routine. “Good afternoon, this is IT; we are just updating your password. Kindly share your MFA code for system sync.” The voice on the other end sounded professional, confident, and even polite. Within 90 seconds, the officer in charge of treasury operations at a mid-sized local bank had unknowingly handed over the keys to the vault. By the time the fraud was discovered, UGX 3.2 billion had quietly vanished through a chain of digital transfers, small enough to evade automated alerts and clever enough to appear internal. What made this case exceptional was not the sophistication of the hackers. It was the naivety of convenience.

The illusion of safety
Inside the bank’s headquarters, efficiency was everything. The CEO prided himself on “frictionless service delivery.” Password resets were streamlined, system logins synchronized, and approvals automated. Staff called it the convenience culture: one password for all systems, minimal downtime, and the comforting belief that “IT has it covered.” When Summit Consulting Ltd was called in two weeks later, the board was in shock. The IT Manager insisted no system had been breached. The Compliance Officer swore all procedures were followed. Yet the treasury account kept haemorrhaging money in small, calculated withdrawals. This was not a brute-force attack. It was a culture breach.

The first crack
During the initial audit, Summit’s digital forensics team pulled server logs. They found that the initial access came from a legitimate device, a bank-issued laptop belonging to Suspect 1, a mid-level treasury officer. The investigators dug deeper. Behind the scenes, a remote-access application had been installed on his laptop, disguised as an update patch from IT. The malware did not just open the door. It stayed quiet, observing. For weeks, it captured credentials, clipboard data, and screenshots. How did it get there? Summit’s forensic image of the drive revealed the truth: a WhatsApp file transfer.

The convenience trap
“IT had told us email attachments were risky,” Suspect 1 explained during his interview. “So, we started sharing updates over WhatsApp instead. It was faster.” A colleague had shared what looked like an Excel macro update. The moment it was opened, a remote-execution script embedded in the file silently installed the backdoor. From that day, every click, password, and transaction entry was monitored by an external actor believed to be a former bank contractor. The irony? The breach originated from an alternative communication channel meant to make work “easier.” Convenience, again, had become the enemy.

The anatomy of the heist
Using harvested credentials, the attacker created a shadow approval workflow. They knew the internal routines: when managers logged in, when auditors checked balances, and which signatures were often delayed. Funds were routed in micro-transfers of UGX 18–25 million, labelled as vendor refunds and forex settlements. The accounts used were genuine, inactive client accounts reopened through social-engineered email approvals. To avoid suspicion, every transaction mirrored a legitimate one from the previous week. The total loss was masked under “system suspense clearing.” No alarms went off because nothing appeared abnormal. The transactions came from authorized users within working hours.

The silent witnesses
Inside the bank, several people saw red flags, but convenience muted their instincts.
The internal auditor noticed the unusual pattern of “duplicate” transfers but assumed it was a system reconciliation. The compliance officer received an email query about the same vendor twice, but approved it because “the boss was traveling.” The system admin ignored a strange login because the credentials belonged to a trusted colleague. Every time someone felt something was off, they brushed it aside for the sake of speed. In the final analysis, Summit’s report noted, “This incident was not a hack. It was a harvest of human trust and organizational complacency.”

The unmasking
Forensic tracing led to a digital fingerprint left behind on the command server. The IP bounced through multiple VPNs, but one session slipped, revealing a location in Bukoto. Summit’s cyber team, working with law enforcement, mapped transactions through local mobile money aggregators. Some of the cash-out accounts were registered under stolen national IDs, but one SIM card connected to the same wallet had been used to pay a Yaka (prepaid electricity) bill in the name of Suspect 2, a former staff member terminated six months earlier. When confronted, Suspect 2’s response was chilling: “They made security so easy I thought it was a test.”

The forensic unravelling
Summit’s forensic reconstruction revealed how the breach evolved in five stages:
1. Access by deception. A fake WhatsApp file disguised as an Excel update installed remote-access malware.
2. Credential capture. Keystrokes and screenshots were harvested silently.
3. Privilege escalation. Compromised admin credentials were used to modify transaction approval queues.
4. Transaction laundering. Micro-transfers funnelled through dormant customer accounts to evade detection.
5. Cash-out & cover-up. Funds withdrawn via mobile aggregators and converted to cryptocurrency within 72 hours.
The attackers understood one truth: in a culture of convenience, nobody double-checks what looks familiar.

The emotional aftermath
The boardroom was silent as Summit presented its findings. The chairperson, visibly shaken, asked, “So our systems were not hacked?” “No,” the lead investigator replied. “Your habits were.” The bank’s leadership realized they had invested in technology but neglected behaviour. They had built a fortress, but left the gate open for speed. The hardest pill to swallow was accountability. Every control had existed on paper. Every policy had a signature. Yet enforcement depended on human discipline, not design.

The culture audit
Following the investigation, Summit conducted a cyberculture audit. The results were eye-opening:
73% of staff reused passwords across systems.
58% admitted forwarding work documents via personal cloud email or WhatsApp.
41% had admin rights they no longer needed.
Only 12% had ever changed default passwords on their devices.
Convenience was not a behaviour. It was a system. Summit’s final report framed it starkly: “The institution achieved operational efficiency at the cost of digital resilience.”

Redefining security
The transformation that followed was painful but profound. Digital discipline became policy. All devices require MFA. WhatsApp file
The CFE Journey begins this November 2025 at Institute of Forensics & ICT Security (IFIS)
If you have ever felt the pulse of a financial irregularity, the sudden spike, the hidden transaction, the silent whistle-blower, then you know this truth: the business of fraud is not simply accounting. It is a battlefield. And the credential that marks you as a qualified warrior in that battle is the Certified Fraud Examiner (CFE), issued by the Association of Certified Fraud Examiners (ACFE). This November 2025, IFIS is your launchpad. It is time to move from being a good auditor or risk professional to becoming the fraud investigator organisations call when the fire hits. Here is your roadmap: the whys, the hows, and the strategic counsel to get you from today to credential-holding, career-elevated, fraud-fighting.

Why the CFE makes sense, and why now
Market demand meets prestige. Fraud is not going away. In fact, it is morphing. The ACFE’s vast body of research shows that organisations with CFEs on staff detect fraud faster and incur smaller losses.
A tangible earnings premium. According to ACFE’s Compensation Guide, CFEs earn, on average, 32% more than their non-certified counterparts. For those with four-year degrees, median compensation among CFEs is approximately US $101,000, and for graduate-degree CFEs, it is about US $112,000. In Africa and the Middle East, median CFE compensation was reported at around US $52,000, still significantly higher than many local peers.
Professional credibility and organisational impact. Holding the CFE signals you are more than an accountant or auditor; you are a trained investigator. CFEs bring “higher fraud detection, professional credibility, career advancement, and enhanced marketability.” In short, if you are working in internal audit, risk, compliance, or investigation (like many of you reading this), the CFE is the credential that marks the shift from good to elite. IFIS gives you the local gateway to that global benchmark.

The eligibility and exam framework
Here is a breakdown of how to qualify and prepare, so you know exactly what you are signing up for.

Eligibility
You must be an Associate Member of the ACFE. Use the ACFE “points system”: you need a minimum of 40 points to sit the exam, and 50 points to earn the credential. A bachelor’s degree typically grants ~40 points. If you do not have a degree, you may substitute two years of fraud-related professional experience for each year of academic study. You must have at least two years of professional experience in a field related, either directly or indirectly, to fraud detection/deterrence by the time you are certified. You must agree to the ACFE Code of Professional Ethics.

Exam structure
Once you are eligible, the path follows:
1. Join ACFE.com as an associate member.
2. Prepare for the CFE exam by buying study materials from the ACFE and attending study classes at IFIS.
3. Apply for the exam.
4. Sit the exam.
5. Get certified.
The exam covers four sections:
Financial Transactions & Fraud Schemes
Law
Investigation
Fraud Prevention & Deterrence
Each section is a separate exam (~100 multiple-choice/true-false questions), and you must achieve at least 75% in each to pass.

Timeline and strategy
Given you plan to begin this November 2025 at IFIS, here is a suggested timeline:
Join ACFE, and begin your preparatory reading.
November (IFIS class start): Attend IFIS guided sessions, form a study group, start weekly topic deep-dives.
February-March 2026: Sit each of the four sections (or grouped as allowed by ACFE).
On passing, gain the official CFE credential, and commence leveraging it for career advancement and organisational value.

Career guidance: where the CFE takes you and how to position it
There are several career paths you can aim for:
Internal Auditor (audit, risk & controls)
External Auditor (public accounting firms)
Compliance or Risk Management Professional
Fraud Investigator / Examiner (private or public sector)
Loss Prevention or Asset Protection roles

How to use the CFE to stand out
Leverage your current skills: If you are in internal audit, highlight your fraud-related work to meet eligibility and to show value.
Show organisational impact: Use metrics, e.g., faster fraud detection, reduced losses. ACFE reports organisations with CFEs detect fraud ~40% sooner and have ~54% smaller losses.
Network globally, act locally: Join the ACFE community, engage in its job board and webinars, but tailor your focus to the East African context (mobile money risk, cyber-fraud, regulatory growth).
Position for leadership: The credential not only earns more money; it signals readiness for roles with greater responsibility (manager, director). The Compensation Guide shows median compensation increasing significantly with the level of responsibility.
Market the credential in your region: In Uganda, Kenya, Nigeria, etc., your “CFE” becomes a differentiator in insurance, banking, audit, and compliance hiring. Emphasise your global credentials plus local application.

Five practical steps you must take right now
1. Join ACFE today! Activate your membership, and you will start getting benefits and begin accumulating eligibility points.
2. Conduct a gap analysis: identify your education points and fraud experience points. If below 40, plan the experience or education to fill the gap.
3. Register for the IFIS November class. Commit now, block your study schedule, and get your colleagues and manager informed (to support you).
4. Design a study plan; e.g., 12 weeks before the exam, break each of the four domains into weeks, and include mock tests, flashcards, and peer-review sessions.
5. Align your current role to your CFE journey: include fraud-related tasks in your performance goals, start logging fraud detection/prevention actions, and build your “case history” (for your resume and for ACFE’s experience requirement).

Why now is the strategic moment for East Africa
The financial services and audit sectors are increasingly under pressure, with rising regulation, greater digital/mobile-money risk, and stronger governance expectations. Organisations are seeking credible fraud-risk specialists. When you hold a globally recognised credential (CFE) and understand local mobile-money, SACCO, and fintech risk in Uganda, you’re rare. By starting in November, you will be ahead of the curve: many professionals delay; being early gives you a first-mover advantage within your organisation or region. For the board and internal-audit context you already operate in, the CFE
When hackers knock: What we learnt at the Cybersecurity and Risk Management Conference 2025
It began with a knock. Not the kind that rattles your door at midnight, but the digital kind. The kind you don’t hear. The kind that seeps in through an innocent-looking email marked “Request for Quotation.” It was 2:13 a.m. on a Saturday in June when the first breach happened. The institution, a mid-sized government agency with offices along Jinja Road, had just completed its payroll run. The staff were asleep. The finance director was abroad. The systems administrator had left his token in a drawer “for convenience.” By Monday, UGX 1.2 billion had vanished; quietly, elegantly, and with surgical precision. That was the story that opened this year’s Cybersecurity and Risk Management Conference 2025, hosted by the Institute of Forensics and ICT Security (IFIS) at Speke Resort Munyonyo. And as each expert took the stage, it became clear: Uganda’s cyber war is no longer theoretical. It’s personal, psychological, and institutional.

The calm before the breach
Suspect 1 was a former IT officer. He knew the system well enough to exploit its blind spots but not well enough to fix them. He’d left the organization two years earlier after being denied a promotion. But his digital fingerprints remained. The agency had never deactivated his admin credentials from the legacy accounting platform. “It was just one of those things we’d get to later,” said one internal source during the post-mortem. Later never came. Using a VPN and public Wi-Fi from a café in Ntinda, Suspect 1 logged in with his old credentials. No alarms. No two-factor authentication. Within ten minutes, he had full access to the payments module. He then created four new supplier profiles. Each bore legitimate-sounding names: Kampala Supply Traders Ltd, Vision Industrial Parts, Equity General Solutions, and Mubende Agro Works. The National IDs used were genuine; they belonged to real people hired as boda riders and casual laborers, each paid UGX 50,000 for “helping open accounts.” The first red flag appeared two weeks later: the internal auditor noticed multiple supplier payments with near-identical narrative descriptions: “Supply of stationery.” But instead of investigating, she was told to “wait until next quarter’s audit.” That hesitation cost the agency almost a billion shillings.

How the money moved
Here’s how the fraud worked. Each ghost supplier had a mobile money-linked bank account. Once the payments cleared, the funds were split into smaller UGX 4–6 million transfers, sent to over 30 wallets. Some of those wallets belonged to staff relatives. Others to mobile money agents near Wandegeya, Kamwokya, and Kyengera. “Follow the cash, not the crime scene,” said the lead investigator from Summit Consulting Ltd, which was called in after the breach. Summit’s digital forensics team traced the movement of funds across three telecom platforms. Within hours, patterns emerged: transfers made between 2–3 a.m., withdrawals in batches of five, and multiple SIMs registered under a single national ID; a textbook indicator of a collusive scheme. “Every fraud has two halves,” explained the Summit investigator. “The insider who knows the door, and the outsider who knows when it’s open.” Their forensic reconstruction revealed that Suspect 1 had an accomplice; Suspect 2, the agency’s current accounts assistant. Suspect 2’s role was simple but crucial.
He approved the transactions using his supervisor’s token while the supervisor was “on travel.” He’d convinced him that “system delays” required him to leave the token in the drawer “for continuity.” That drawer, investigators later discovered, was the breach’s front door.

The breach beneath the culture
At the conference, Summit’s presentation drew gasps; not because of the technical sophistication, but because of how ordinary the setup was. The agency had invested over UGX 300 million in cybersecurity tools. Yet none of them mattered because human culture was the weakest link. There was no segregation of duties. Tokens were shared. Audit logs weren’t reviewed. And the internal auditor lacked system access to monitor real-time transactions. One delegate whispered, “This could be any of us.” He was right. Ugandan institutions often think cyber risk is about hackers in hoodies. The real enemy, as we learnt, is organizational comfort; the belief that loyalty equals security. “We trust our people too much,” said one CEO on the panel. “But trust without verification is the breeding ground for fraud.”

The forensic breakthrough
Summit’s digital forensics lab, operating from Nakasero, recreated the full digital trail using forensic imaging. Every keystroke, timestamp, and token approval was reconstructed. A chilling detail emerged: Suspect 1 had logged in on Independence Day, a public holiday. No one noticed. The intrusion lasted only 17 minutes. Within that window, he created and approved four payment vouchers worth UGX 860 million. How did he get the approvals through? He used an automation script to mimic the payment workflow, leveraging the shared token credentials of the finance manager. The script executed instantly, bypassing human verification. When Summit Consulting presented this sequence during the conference, the room went silent. Because everyone realized: the system wasn’t hacked from outside. It was used exactly as designed.

The anatomy of insider collusion
Summit’s report to the board detailed a textbook example of insider-enabled fraud:
Role | Action | Control bypassed
Suspect 1 (ex-IT staff) | Accessed the system using old admin credentials | User deactivation control
Suspect 2 (accounts assistant) | Processed ghost supplier payments | Token misuse and weak supervision
Finance manager | Left the token unsecured | Poor physical control
Internal auditor | Deferred review | Lack of real-time audit visibility
HR | Failed to ensure exit clearance | Weak access offboarding
Each small negligence formed a chain. Together, they became a breach. “Fraud isn’t a single act of genius,” said the Summit consultant. “It’s a series of small permissions.”

How Summit cracked the case
The investigation took 11 days. The first 72 hours were the hardest. Logs had been deleted. Tokens were “missing.” HR insisted Suspect 1 left cleanly. But Summit’s forensics team pulled a digital rabbit out of the hat. They recovered fragments of deleted logs from a backup server that had been “ignored” by IT. These logs showed a unique pattern: every intrusion occurred within
Crack the code of corporate fraud: Become a Certified Fraud Examiner this November.
The first clue was not a missing file. It was a smile. A senior procurement officer at a well-known NGO, let’s call her Suspect 1, walked into work every day with disarming calm. She spoke softly, prayed before meetings, and was everyone’s go-to person for “urgent payments.” When auditors came, she baked muffins. For three years, she processed clean transactions until one junior accountant noticed something strange: three supplier invoices, printed on different letterheads, but sharing the same email domain. That single inconsistency unravelled a fraud web worth UGX 2.7 billion; money meant for maternal health projects in Gulu and Soroti. By the time the dust settled, the NGO’s reputation was shredded. The donors were furious. And the board, stunned, asked the same question every Ugandan institution eventually does: “How did this happen under our watch?” The answer was simple: no one in the room could think like a fraud examiner.

The invisible crime wave
Fraud is not loud. It’s quiet. It doesn’t kick down your door. It whispers through your payroll system. It hides in a decimal point. It wears a smile. Every year, Ugandan companies lose billions to what accountants politely call “irregularities.” But behind that polite term are real betrayals; finance managers approving ghost payments, internal auditors signing what they never reviewed, and system admins deleting evidence just before resigning. You don’t read about most of these cases because they’re buried in board minutes and NDAs. But if you sat in one of Summit Consulting Ltd’s investigations, you’d see the real movie. A CEO stunned by evidence he never imagined. A driver-turned-middleman ferrying cash to mobile money agents. And a junior staffer trembling as forensic analysts pull up the logs that tell the story his mouth cannot. That is the world of Certified Fraud Examiners (CFEs); the world behind the spreadsheets.

The mind of a CFE
A Certified Fraud Examiner is not a detective with a magnifying glass. They are a strategist with a microscope. They see what others miss: the way a payment is approved at 8:17 p.m., always by the same person. The supplier who always wins tenders by one decimal point. The staff member who never takes leave because their fraud would unravel in their absence. To a CFE, every inconsistency is a clue, every anomaly a confession. And this November, Uganda’s next generation of CFEs will gather at the Institute of Forensics and ICT Security (IFIS) for the country’s most rigorous fraud investigation program, powered by Summit Consulting Ltd and certified by the Association of Certified Fraud Examiners (ACFE, USA). It’s not a workshop. It’s a battlefield simulation.

Inside the training room
Day one begins with a sealed envelope marked Case 004 – The Ghost in Accounts. Inside: photocopies of supplier invoices, a list of suspicious transactions, and an anonymous whistleblower email. Participants work in teams. They cross-reference bank statements. They map out shell companies. They examine phone records. By the second hour, one team notices that the “supplier” shares an address with the agency’s own office. Another connects WhatsApp screenshots showing a procurement officer asking, “Have you received the fuel refund?” By day three, they’ve cracked it. They’ve reconstructed an entire fraud ring from fragments. Then Summit’s lead investigator walks in and says, “Congratulations. That was a real case we investigated. Everything you uncovered actually happened in Uganda.” The room goes silent. Some laugh nervously.
Others stare. Because suddenly, theory has turned into truth.

The crimes no one talks about
Fraud in Uganda has evolved from forged cheques to digital deception. In one 2024 case, a telecom engineer created “test SIM cards” for internal network trials, but later activated them for personal use. Each line was loaded with airtime and data meant for customers. Over 11 months, the scheme siphoned UGX 600 million. In another, a microfinance manager colluded with a mobile money agent to register ghost borrowers. The repayment system showed “active loans,” but the cash was being withdrawn in Kyengera every Friday morning. And perhaps the most chilling: an insurance company’s data officer altered claim details using stolen logins from a deceased colleague. Each case had one thing in common: the organization lacked a trained fraud examiner who could see the pattern early enough.

Why November matters
The November CFE intake is not just another course. It’s a national necessity. Uganda’s corporate world is entering a dangerous phase, with more systems, more data, and more vulnerability. Boards are approving digital transformation projects faster than they can understand them. Risk managers are drowning in compliance checklists while fraudsters automate their crimes. “The next big corporate scandal won’t start in IT,” said one panellist at the Cybersecurity Conference. “It will start in Finance and end in Forensics.” This is why IFIS, in partnership with Summit Consulting, designed the November CFE training as an investigative laboratory. Every lesson comes with a local case: how an insider diverted donor funds using payroll ghosting scripts; how Summit traced embezzled money through mobile wallets; how a forensic trail led investigators to a café in Ntinda, where the suspect logged in on public Wi-Fi. This is not imported theory. It’s Uganda’s corporate underworld; decoded.

From accountant to investigator
The CFE program transforms ordinary professionals into strategic fraud defenders. You will master four domains:
Financial Transactions & Fraud Schemes – Learn how money moves in the shadows.
Law – Understand evidence so airtight it survives court cross-examination.
Investigation – Conduct interviews that extract truth without intimidation.
Fraud Prevention & Deterrence – Design systems that make fraud unattractive.
And when you pass, you join a global elite; over 90,000 Certified Fraud Examiners across 180 countries, trusted by banks, regulators, and governments to detect deception before it becomes a disaster.

The Uganda story behind the credential
In Uganda, CFEs are the unsung heroes behind high-profile recoveries. They’re the ones who sit behind the “management decision” headlines. It was a CFE who traced UGX 1.2 billion siphoned from a donor-funded project through a chain of boda riders’
How one click becomes a crisis in an organization
October is still on, the cybersecurity awareness month. Across Uganda, thousands of employees will walk into offices, open emails, and click without thinking. One careless click is all it takes. That is why Cybersecurity Awareness Month is not just another “theme month.” It is a survival drill. Cybercrime is no longer a distant story from America or Europe. It is a Ugandan reality. From SACCOs in Masaka losing millions via mobile money SIM swaps, to hospitals in Kampala locked out of patient records by ransomware, to government agencies paying ransom quietly, cyber risk is here. It is local. It is expensive. And it is growing.

Why awareness matters
The biggest myth in cybersecurity is that technology alone will protect you. Firewalls, antiviruses, and fancy dashboards mean nothing if your people are blind to threats. Eight out of ten breaches in Uganda begin with human error: a staff member reusing passwords, downloading fake invoices, or sharing sensitive data on WhatsApp. Cybersecurity Awareness Month exists to break that ignorance. To remind every staff member that they are the first firewall.

The cost of ignorance
A Tier 2 bank lost UGX 1.4 billion in a single phishing campaign. A private school had its entire website defaced, leaving parents questioning its credibility. An NGO donor froze funding after hackers exposed project data. None of these began with a “major hack.” They began with ignorance.

What must leaders do this October?
Make cybersecurity cultural, not seasonal – One month of slogans is not enough. Embed cyber habits into daily work.
Invest in drills, not posters – Staff remember simulated phishing tests, not motivational banners.
Hold EXCO accountable – Cyber risk is a governance issue. Boards must demand evidence of preparedness, not promises.
At Summit Consulting Ltd, we say: “Cybersecurity is not IT. It is a survival strategy.” This month, we are running free awareness trainings for organizations that dare to take risk seriously. One hour with us could save your organization billions. The question is not whether hackers will strike. It is whether your team will recognize the attack when it happens. Stay aware. Stay protected. Visit https://forensicsinstitute.org/ to access free resources. Be safe online.

Cybersecurity Month 2025: Stay aware. Stay protected.
One careless click. That’s all it takes to lose millions. Cybercrime is no longer a foreign headline; it is Ugandan. A SACCO in Masaka was wiped out via SIM swaps. A Kampala hospital was locked out of patient records by ransomware. A Tier 2 bank lost UGX 1.4 billion to phishing. None of these started with “big hacks.” They started with ignorance. The truth? Technology won’t save you if your people are blind. Eight out of ten breaches in Uganda begin with human error: weak passwords, fake invoices, or sharing data on WhatsApp. This October, the Institute of Forensics & ICT Security, a technical training arm of Summit Consulting, is leading Cybersecurity Awareness Month. We are offering free awareness training to organizations ready to treat cyber risk as a governance issue, not an IT problem. Boards, EXCOs, and CEOs: stop asking “What if hackers strike?” Start asking: “Will my team recognize the attack when it comes?” Register your team today. Save your reputation tomorrow. events.forensicsinstitute.org. Stay aware. Stay protected.
Tomorrow, We Secure the Future at the Cybersecurity Conference, 2025
When Uganda’s first mobile money platform launched in 2009, few could imagine it would one day carry the country’s economy. Today, in 2025, over 33 million Ugandans transact daily using their phones, buying food, paying fees, and even funding campaigns. The phone has become the bank, the ID, and the ballot box of modern life. But with every innovation comes a shadow. Beneath Uganda’s digital revolution lies a quiet war; one not fought with guns or politics, but with passwords, deception, and misplaced trust. And as the nation edges toward a more connected, election-season future, the question is no longer if we can go digital, but how securely we can stay that way. Tomorrow’s Uganda depends not on bandwidth, but on cybersecurity.

The invisible economy of trust
Every phone in Uganda is now a node in a massive digital network; a web of trust linking citizens, telecoms, banks, hospitals, and government services. Each transaction carries identity, intent, and value. The moment that trust falters, the entire ecosystem trembles. “Fraud today is not about hacking systems,” says a senior investigator at Summit Consulting Ltd, Uganda’s leading forensics and risk advisory firm. “It’s about hacking people.” Consider a recent case. At a hospital in Kampala, Suspect 1, posing as a telecom technician, convinced a nurse to read out a “verification code” she had just received on her phone. Within minutes, her WhatsApp and mobile wallet were hijacked. No malware. No brute-force attack. Just persuasion. This is what cybersecurity experts call social engineering: the weaponization of trust. It’s low-tech, fast, and devastating. The rise of such schemes marks a turning point. The old assumption that cybersecurity belongs to “IT departments” no longer holds. In 2025, everyone with a phone is part of the national security infrastructure.

Identity: the new battlefield
In the connected economy, identity has become currency. And whoever controls identity controls trust. Telecom fraud no longer stops at SIM swaps. Criminal networks now reconstruct full digital identities, linking stolen ID photos, leaked phone numbers, and scraped social media data to create near-perfect clones of real users. In one investigation, Suspect 2, an insider at a mobile money aggregator, leaked subscriber data to an external ring that used the information to open “ghost accounts.” These accounts mimicked real users: same names, same dates of birth, and same photos. By the time anomalies were detected, a lot of money had moved through the network, all appearing legitimate. What makes identity fraud dangerous isn’t just financial loss; it’s systemic confusion. When fake and real identities overlap, accountability collapses. Who sent that money? Who posted that message? Who owns that number? In a politically sensitive season, such questions blur the line between cybersecurity and democracy itself. Future security, therefore, won’t rely on stronger passwords but on verified digital identities; encrypted, biometric, and traceable across systems. Uganda’s telecoms are beginning to explore blockchain-backed KYC (Know Your Customer) models, where identity verification becomes both private and auditable. But adoption remains slow.

The insider problem
Technology evolves faster than ethics. The most advanced system can still fail if the wrong person has the right access. According to Summit Consulting’s Project Frontline 2025 analysis, insider collusion accounts for over 65% of telecom-related fraud in East Africa.
The typical case involves a mid-level employee, customer care, field operations, or data entry, who bypasses security protocols under the guise of “helping a customer.” In one case study, Suspect 3, a call centre agent, approved over 40 SIM swaps for “VIP customers” in a single week. Each transaction followed the procedure, was logged correctly, and passed internal checks. The fraud was only discovered when multiple clients reported losing access simultaneously. How did this happen? The answer wasn’t technical. It was cultural. Local telecom and financial industries, like many across Africa, are built on hierarchical trust; a culture where questioning authority can be mistaken for insubordination. Fraudsters thrive in such environments because silence is predictable. As the iShield Project 2025 notes, “Fraud doesn’t hide from systems. It hides in courtesy.” The future-ready solution is not more controls; it’s distributed accountability. Systems where no individual can both initiate and approve a high-value transaction. Where anomaly detection flags not just numbers, but behavioural deviations; logins at odd hours, unusual keystroke rhythms, sudden access to restricted modules.

When data whispers, leaders must listen
The irony of modern fraud is that it’s often invisible, not because it’s hidden, but because it’s ignored. Every fraudulent transaction leaves a trail: timestamps, device IDs, and IMEIs that tell a story for those patient enough to listen. Yet most organizations drown in data but starve for insight. At Summit Consulting’s Digital Forensics Lab, investigators use AI-driven visualization to detect what the human eye misses: clusters of mobile transactions that occur at the same time, from the same device, across multiple accounts. The resulting heat maps are stunning: patterns that look ordinary on paper suddenly glow red with intent. In one such case, a telecom’s internal analytics flagged nothing unusual. Summit’s visualization revealed that all “routine” midnight transactions came from a single IP range, an employee dormitory near the data centre. The breach was not technical. It was behavioural. Future-ready risk management will depend on behavioural analytics; systems that don’t just secure data but learn from it. In an era where AI can detect emotional tone and typing speed, predicting insider risk is no longer science fiction. It’s a leadership necessity.

From firewalls to digital immunity
Uganda’s cybersecurity challenge isn’t about firewalls; it’s about mindset. In a country where most users think antivirus software is “for computers only,” building a cyber-resilient culture requires a shift from fear to literacy. Every citizen, from boda rider to banker, needs to understand that protecting their PIN is not paranoia; it’s patriotism. Telecoms are beginning to introduce “digital immunity” programs; training their agents to recognize social engineering, enforce transaction limits, and use biometrics for high-value approvals. Some banks now run internal “red team” exercises, where ethical hackers simulate real attacks to test employee readiness. Summit
Password 123456: What could possibly go wrong?
When the fraud finally unravelled, it wasn’t through a sophisticated cyberattack or a shadowy hacker operating from a foreign server. It began, as many local fraud stories now do, with a six-digit code: 123456; typed casually into a phone. It was an ordinary Thursday morning at a hospital when “Suspect 1,” a man dressed in a faded telecom-branded jacket, approached a nurse outside the outpatient wing. He spoke softly, like someone used to fixing problems others didn’t understand. “Madam, your SIM card needs verification. The system shows an update error. Please read me the code that just came to your phone.” She did. Within minutes, her WhatsApp was gone, her mobile money emptied, and her reputation compromised. No malware. No brute force. Just a human voice, a believable story, and six innocent digits.

The illusion of safety
Ugandans have come to see their phones as symbols of progress. Inside a single device lives the wallet, the ID, and the business ledger. Yet, in a country where over 30 million people rely on mobile money daily, the phone remains the least protected asset they own. Telecom fraud has become more human than technical. It no longer happens in dark server rooms but in bright daylight, through cloned SIMs, insider approvals, and misplaced trust. In one recent case, Summit Consulting Ltd, the firm called in to investigate a suspected data breach at a leading telecom agent, discovered that the root cause wasn’t software failure. It was a supervisor’s compassion. “Suspect 2,” a well-rated back-office employee, had approved a SIM swap for what she believed was a colleague’s sick mother. No verification, no ID scan, just empathy. The swap opened a gateway to over 200 million shillings in fraudulent withdrawals before anyone noticed. “We like to believe fraud happens because systems are weak,” said one investigator at Summit Consulting. “In truth, it happens because people are predictable.”

The power of predictability
In cybersecurity circles, “123456” has become a global joke; the world’s most common password, used by millions who assume no one would bother guessing it. Yet here in Kampala, it’s more than a punchline. It’s a mindset. Telecom engineers reuse it for testing accounts. Agents use it as a default PIN during customer registration. Bank tellers use it for their email logins. Even internal training systems default to it for convenience. The problem isn’t the number; it’s what it represents: human laziness disguised as efficiency. Summit’s forensic team once traced a fraud ring that had compromised over 50 dormant SIM cards. The team expected an advanced exploit or a stolen master key. Instead, they found that every account used the same default password left unchanged since activation. “The password wasn’t hacked,” one analyst explained. “It was inherited.” The cost of convenience, as it turned out, was 1.2 billion shillings in unauthorized airtime and mobile money transfers.

Inside the breach
In most telecom fraud investigations, the trail doesn’t lead to an outsider; it leads to the inside. Fraud, in its most modern form, has become a quiet partnership between colluding employees and external agents. Take the case of “Suspect 3,” a retail agent operating from a dusty roadside kiosk in Mbarara. Every evening, he processed dozens of SIM swaps from “urgent corporate clients.” The data came from an insider, someone with system privileges, who sent him lists of numbers and ID details scraped from internal servers. The two shared profits through mobile money.
When the fraud was finally detected, the system logs told a quiet but damning story: identical terminal IDs, midnight approvals, and transactions moving in neat 10-minute intervals. It wasn’t hacking. It was routine. The auditors had missed it for months because it looked too organized to be suspicious.

The data never lies
Telecom fraud, like all digital frauds, is never invisible. It always leaves digital fingerprints: timestamps, IP addresses, and login sequences that form patterns only data analytics can see. In one Summit Consulting forensic case, investigators visualized six months of mobile money activity on a heat map. What emerged was chilling: a single geographic cluster processing transactions between 1:00 a.m. and 3:00 a.m. daily, all linked to the same back-end approval node. The fraudsters weren’t hiding. They were working inside the system, confident no one was watching. “What kills organizations is not lack of data,” Mr Strategy noted in his post-investigation briefing to a client’s Audit Committee members. “It’s the refusal to listen to what data is screaming.” Even the simplest dashboard can flag fraud if someone pays attention. Yet, many institutions mistake dashboards for control, forgetting that detection without action is still negligence.

Culture: the invisible firewall
Every company, telecom or bank, boasts of firewalls, encryption, and biometric verification. Yet, the most effective control remains human behaviour, and that’s where Uganda’s telecom sector faces its greatest challenge. Fraud thrives in cultures where silence is rewarded, and questioning authority is frowned upon. When employees fear raising red flags or believe that “reporting a colleague” is betrayal, controls crumble. In one telecom case investigated by Summit Consulting, an internal audit officer admitted, off record, that he had noticed the irregular SIM swap pattern but didn’t escalate it because it involved a “high-performing staff member.” The loss? Nearly 181.4 million shillings. As Mr Strategy often says, “Cybersecurity is not a department; it’s a culture of disciplined doubt.” That means separating duties so no one can approve and verify the same transaction. It means training staff not to trust, but to verify. And most importantly, it means rewarding curiosity, not compliance.

Building immunity, not walls
As our digital economy deepens ahead of the 2026 elections, telecom networks will face their most complex tests yet. Disinformation, identity fraud, and insider manipulation will intersect in ways unseen before. Yet the solution is not more secrecy or blame. It’s transparency. Organizations must build immunity, not walls. Immunity comes from openness, collaboration, and real-time monitoring. When fraud happens, the first question shouldn’t be “who leaked?” but “which control failed and why?” Summit Consulting’s
Cybersecurity Awareness Month; Day 10, October 2025, issue 10 of 30: Election integrity through cybersecurity
As Ugandans prepare for elections, remember that the integrity of an election is no longer guaranteed by ballot boxes or transparent counting. It is guaranteed by secure data. Modern elections are information systems disguised as civic events. Every stage, from voter registration to results transmission, relies on digital infrastructure that can be corrupted long before a single vote is cast.

The real battlefield is data integrity
Every democracy stands on a digital skeleton: databases, networks, and authentication systems. If any bone is weak, the entire process collapses. A compromised voter register could silently disqualify genuine voters. A hijacked transmission channel alters perception before verification. Compromised results transmission could trigger unrest, because inaccurate, manipulated results could be announced and create false expectations. Cybersecurity, therefore, is not a technical add-on; it is the architecture of trust.

The threats are systemic, not spectacular. In Uganda and across emerging democracies, the danger is not hackers in hoodies. It is insiders with access, temporary staff without training, and contractors using unsecured devices. When election data travels through flash drives, email attachments, or third-party servers, the opportunity for manipulation is constant. Cybersecurity provides the discipline, access control, encryption, and audit logs that ensure no one can quietly rewrite history.

The fact is, cybersecurity equals credibility. An election’s legitimacy depends on evidence. When logs are immutable, every change has a signature. When databases are hashed and verified, every dataset can be proven authentic. When access requires multifactor authentication, insiders can no longer act invisibly. Cybersecurity converts process into proof. Without it, all that remains is faith.

How to build resilience to prevent a crisis
Cybersecurity for elections is not about responding to hacks; it is about preventing suspicion. This can be achieved through:

a) Voter data governance
This is the foundation of election integrity. It ensures that every citizen’s record is collected, stored, and maintained without alteration or loss.
The first step is secure capture. During registration, data should be collected using tamper-proof devices connected to a secure network, not through shared laptops or unprotected USB drives. For example, if a registration officer in Gulu uses a tablet to capture details, the data should immediately encrypt and sync to a secure central server, not sit on the device overnight. This prevents local tampering and accidental leaks.
The second step is encrypted storage. Every record in the voter database must be encrypted both “at rest” (when stored) and “in transit” (when being transferred). Think of encryption as locking every voter’s file in a digital safe. Even if a backup drive is stolen, the data is unreadable without the correct key. In practice, this means using tools like AES-256 encryption for stored files and HTTPS/TLS connections for any transfer.
The third step is verified backups. Regular backups protect against system failure or deliberate sabotage. But backups must themselves be verified. It is not enough to say “we back up.” Each backup should be checked for completeness, encrypted, and stored off-site, say, one copy in a government data centre and another in a disaster recovery facility in a different district. A simple checksum or hash comparison between the main and backup data ensures nothing has been quietly altered, as the short sketch below illustrates.
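To make the idea concrete, here is a minimal sketch of such a checksum comparison in Python, assuming the register export and its backup are available as files; the file names and sample record are purely illustrative, not taken from any real system. The same fingerprinting idea underlies the public verification hashes discussed under transmission integrity below.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 fingerprint of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Illustrative stand-ins for the primary register export and its off-site backup.
primary = Path("voter_register_2025.csv")
backup = Path("voter_register_2025_backup.csv")
primary.write_text("NIN,Name,District\nCF123,Jane Doe,Gulu\n")
backup.write_text("NIN,Name,District\nCF123,Jane Doe,Gulu\n")

primary_hash = sha256_of(primary)
backup_hash = sha256_of(backup)
print("Primary:", primary_hash)
print("Backup :", backup_hash)

# Identical fingerprints mean the backup is a faithful copy; any difference
# means one of the two copies was altered or corrupted and needs investigation.
print("MATCH" if primary_hash == backup_hash else "MISMATCH: investigate before relying on either copy")
```

Run routinely after each backup cycle, a check like this turns “we back up” into evidence that the backup still matches the master copy.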
b) Access management: role-based rights, session timeouts, and real-time monitoring
Access control determines who can do what, when, and how. The weakest system is the one where everyone can access everything.
A good system has role-based rights. Each user, whether data clerk, supervisor, or IT administrator, must have access only to what they need. For instance, a district officer can update records for their district but cannot edit national data. Similarly, a helpdesk agent can view but not modify records. Role segregation prevents one insider from quietly manipulating entries without oversight.
It also enforces session timeouts. Idle sessions are silent backdoors. If an officer logs in to the voter database and walks away, anyone passing by can make changes. Automatic session timeouts after 10–15 minutes of inactivity, combined with two-factor re-authentication, stop such unauthorized activity. It’s a simple discipline that saves millions in potential disputes.
And above all, it enables real-time monitoring. Modern systems should record every access attempt: who logged in, from where, and what they changed. A monitoring dashboard can flag anomalies, e.g., “User X logged in at midnight from a different region.” Automated alerts to supervisors ensure accountability before damage spreads.
A free election is one where no citizen is digitally excluded. A fair election is one where every data change is provable. When systems are tamper-evident, even disagreement cannot erode confidence. The absence of cybersecurity is the new form of disenfranchisement: quiet, technical, and irreversible. And as a Ugandan, I am happy for the voter registration campaigns that the government has been driving to get every citizen to participate in the next election. Bravo.
c) Transmission integrity: end-to-end encryption and public verification hashes for results
When election results or register data move between systems, from polling centres to tally servers, integrity is everything. Two steps help achieve that.
First, end-to-end encryption. Every transmission of results should travel in a secure, encrypted channel from origin to destination. Imagine each results file sealed in a digital envelope that only the official server can open. Even if the data passes through telecom networks or the internet, no one can read or alter it without detection. Using VPN tunnels or SSL/TLS ensures that what leaves a polling centre arrives unaltered at headquarters.
Second, public verification hashes. Transparency builds trust. A hash is like a fingerprint for a file; if even one number changes, the fingerprint changes. By publishing verification hashes of official results or voter registers, the public and observers can confirm that the data they receive matches the authentic version. For instance, if a district tally sheet has a hash value “A9F3C…”, anyone can check that value against the published one to verify authenticity. This removes speculation and lets evidence speak.
d) Incident response: pre-agreed protocol and cross-party oversight for any anomaly
No system is perfect.
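No system is perfect, which is exactly why the real-time monitoring described in (b) matters as much as prevention. As a closing illustration, here is a minimal sketch of how even a small script over exported access logs can surface the anomalies described there; the users, regions, working hours, and log format are assumptions made for illustration, not a reference implementation.

```python
from datetime import datetime

# Hypothetical login events exported from an access log: (user, timestamp, region).
events = [
    ("clerk_gulu_01", "2025-10-09 14:32", "Gulu"),
    ("clerk_gulu_01", "2025-10-10 00:41", "Kampala"),   # midnight login, wrong region
    ("admin_hq_03",   "2025-10-10 09:05", "Kampala"),
]

# Assumed policy: each user has a home region and a normal working window.
home_region = {"clerk_gulu_01": "Gulu", "admin_hq_03": "Kampala"}
WORK_START, WORK_END = 7, 19   # 7 a.m. to 7 p.m.

def flag(user: str, ts: str, region: str) -> list[str]:
    """Return the reasons (if any) why this login looks anomalous."""
    reasons = []
    hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
    if not (WORK_START <= hour < WORK_END):
        reasons.append("outside working hours")
    if home_region.get(user) and region != home_region[user]:
        reasons.append(f"region {region} differs from home region {home_region[user]}")
    return reasons

for user, ts, region in events:
    reasons = flag(user, ts, region)
    if reasons:
        print(f"ALERT {user} at {ts}: " + "; ".join(reasons))
```

The value is not in the script itself but in the discipline it represents: someone looks at access records every day, and anomalies reach a supervisor before they become headlines.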
Cyber Hygiene is Not an IT Issue but a Culture Issue
Imagine walking into a hospital. The walls are clean, the staff is dressed in neat uniforms, and everything looks perfect on the surface, but behind the scenes, one nurse decides to skip washing her hands “just this once.” Maybe she’s tired, or in a hurry, or thinks nothing will happen. What follows is devastating: an infection spreads to patients and colleagues alike. Those who never broke protocol still suffer the consequences of one person’s negligence. This is exactly how cybersecurity works. One weak link, one careless act, can expose the entire organization to risks that no amount of sophisticated technology can fully contain. Cyber hygiene is not about the IT department working in isolation but about every single person, regardless of title, becoming a guardian of the organization’s digital well-being. When we talk about cyber hygiene, we are talking about the small, daily, almost invisible actions that build resilience. Just like washing hands, they are simple, but they matter more than we often realize.

What does good cyber hygiene look like?
Locking your screen every time you step away, even if it’s just for a minute.
Enabling Multi-Factor Authentication (MFA) and resisting the urge to disable it when it feels “inconvenient.”
Updating your system promptly, not postponing it for “later” (which usually never comes).
Refusing to share credentials, even with that trusted colleague who “just needs to check something quickly.”
Being curious and cautious about every link, attachment, or message, because cybercriminals thrive on trust.

Sadly, in many workplaces, we have normalized cutting corners. We laugh at strong password requirements, we dismiss those “security pop-ups” as annoying, we say, “I trust my team,” and treat protocols as optional rather than essential. Trust without verification is not culture; it is carelessness. An organization’s cybersecurity posture is only as strong as its weakest cultural link. It’s not the firewall or antivirus that will save you; it’s the everyday discipline of your people. Cyber hygiene must become second nature, like fastening a seatbelt or washing hands before surgery.

And so, I challenge you: Who is your biggest cybersecurity risk? Is it your IT system… or is it your culture? Take a moment to answer this poll: “Who is your biggest cybersecurity risk?” We shall share the results at the Cybersecurity and Risk Management Conference on Thursday, 16th October 2025, at Speke Resort Munyonyo. Register: https://event.forensicsinstitute.org/ This conference will unpack not just the technology, but the human behaviors that shape organizational resilience. It is a conversation no serious leader should miss. Book your free cybersecurity session here and save up to UGX 5 million. Small habits save big money.
Cybersecurity Awareness Month; Day 9, October 2025, issue 9 of 30: The illusion of small money
Why SACCOs are prime targets
In March 2025, a rural SACCO in Mbarara lost UGX 64 million. No hacker was involved. No firewall was breached. The loss was purely internal, orchestrated by an insider who approved loans to ghost members and routed funds to mobile wallets owned by associates. When the police cyber unit was called, the investigators found a simple truth: the fraudster never needed to hack the system; he only needed access to a logged-in computer. Small SACCOs assume they are beneath the radar of cyber criminals. That assumption is their first vulnerability. Fraudsters target them precisely because they are small, where one staff member often handles accounting, teller, and system administration. In such setups, segregation of duties is a dream, not a practice. The illusion of “low risk” creates a fertile ground for invisible fraud. Members trust managers implicitly, and managers, overstretched by operational chaos, rarely scrutinize logs. The fraudster thrives in this trust gap. During training, Summit Consulting, in partnership with the Institute of Forensics & ICT Security, often simulates this by setting up a dummy SACCO system. When the teller logs in, a remote monitor records the session. The teller steps away, and “Suspect 1” walks in, authorizes a fake loan, and withdraws funds. No hacking tools. Just an opportunity. To drive the point home, participants are asked to list three roles in their SACCO that are combined in one person. Most realize, with unease, that one staff member has the keys to the kingdom, from cash handling to system administration.

The chair problem: unattended desktops
A security guard at a SACCO branch once noticed something strange. Every lunchtime, the teller’s computer remained open, with the system logged in. One day, the guard saw a man in a reflective vest, supposedly a maintenance worker, approach the computer, type something quickly, and leave. Later, the SACCO’s system showed that UGX 3 million had been transferred to an account named “Member 112B.” The name didn’t exist in the records. Unattended desktops are the silent epidemic in SACCOs. Staff assume that physical access is protection enough; “after all, who would dare touch my computer?” Yet, anyone with five minutes and a curious mind can reroute funds, alter records, or delete evidence. Fraud today doesn’t require coding skills. It requires patience, observation, and a moment of negligence. As part of Cybersecurity Awareness Month, we made participants watch a live simulation. A staff member logs in and leaves for “a quick errand.” Suspect 1 enters, approves a pending transfer, then logs out. The transaction looks legitimate because it came from a valid session. Thereafter, each participant is asked to role-play the same scenario in their teams. The lesson is clear: a single unattended session can destroy years of trust.

Ghost members and internal collusion
In 2023, a SACCO in Masaka discovered that 47 “members” in their database were either deceased or non-existent. The records had been created over time by a staff member who recycled data from real ID copies. Loans were processed in the ghosts’ names, approved using colluding officers’ credentials, and withdrawn immediately after disbursement. This is the classic ghost-member fraud. It thrives in environments where oversight is manual and verification is relaxed. Staff exploit the lack of data validation by using relatives’ national IDs or editing one digit of an existing member’s number. The collusion extends upward.
Supervisors sign off without cross-checking NIN details or confirming that the supposed member ever visited the SACCO. In many rural branches, loan verification calls are “too costly.”

The mobile money trap
In the same Masaka SACCO referred to above, every transaction was confirmed via mobile money. Yet, something didn’t add up. Deposits recorded on the MoMo statement didn’t match the SACCO’s ledger. An agent, working with an internal staff member, had perfected the art of double-posting. Here’s how it worked: when a member deposited UGX 500,000, the agent processed the transaction twice. One went through the official channel, while the second was entered manually into the SACCO’s system as a “pending update.” The manual entry inflated balances temporarily, giving the illusion of cash availability. When reconciliation was done, it was brushed off as “system delay.” The fraud continued for months. By the time it was discovered, the SACCO had lost over UGX 40 million. During training, we usually give participants a printed MoMo statement and a system ledger. They must match entries line by line, an eye-opening task that shows how simple reconciliation could have prevented massive loss.

Loan approval collusion
Every fraud has a timing window. In SACCOs, that window often opens after 5 p.m., when managers leave and systems are “quiet.” That’s when Suspect 1 strikes. Using saved passwords in browser autofill, they log in as the manager, approve a batch of loans, and disburse them before anyone notices. Loan approval fraud is elegant because it hides behind authority. The system records a valid approval under a legitimate account. The next morning, everything appears normal until funds start disappearing. The trick thrives because managers are careless with password security. Many still rely on autofill or share credentials “for convenience.” Yet convenience is the first cousin of catastrophe. During cybersecurity presentations, participants are asked to map their SACCO’s loan approval process and highlight every point where one person can act without oversight. The discovery often leads to uncomfortable silence.

The printout manipulation trick
Fraud in SACCOs often hides not in digital systems, but in paper trails. Receipts, those little pieces of printed proof, can be the biggest deception tools. In a Gulu SACCO, members began to complain that their savings were “missing.” They had receipts showing deposits of UGX 300,000, yet the system reflected UGX 100,000. Upon investigation, Summit found that staff were saving receipts offline as image files, altering the numbers in editing software, and printing them out as genuine receipts. The audit team never cross-verified printed receipts with system-generated ones. They trusted paper more than data. To demonstrate the risk, I usually show

