Cybersecurity Awareness Month; Day 8, October 2025, issue 8 of 30: Regulators
Your silence is a breach: inside Uganda's quiet regulatory crisis fueling cybercrime

On the morning of August 22, 2022, a bank manager in downtown Kampala received a call that froze her. Overnight, over UGX 3.6 billion had been siphoned from the bank's mobile money settlement account. The digital trail led nowhere. The servers had been tampered with, and the audit logs had been wiped clean. The fraudsters had used legitimate system credentials, but from devices that had never been registered on the corporate network.

It is standard protocol for banks to report such incidents to regulators. When the bank escalated the breach to its regulator, the reply came three days later: "We are reviewing the incident and will issue guidance." That silence was all the hackers needed. In those 72 hours, similar attacks hit two microfinance institutions, one telecom, and a payment aggregator. The pattern was identical: insiders colluding with external attackers, exploiting delayed advisories, and vanishing into the fog of digital cash.

Please note that, as part of Cybersecurity Awareness Month 2025, we continue to share cases we have handled to create awareness. Some names, facts, and specifics have been changed to protect the identity of our clients as part of our non-disclosure obligations. These cases take a lot of time to compile and write. To support us, attend the Cybersecurity and Risk Management Conference. Register here: https://forensicsinstitute.org/

Across Uganda's fast-digitizing economy, this story is repeating itself: quietly, systematically, and dangerously. The slow response and silence of some regulators are the hacker's opportunity.

The quiet gap that cost billions

In the early 2000s, Uganda's regulatory ecosystem was built around compliance, not cyber resilience. Banks and insurance companies submitted quarterly reports, telecoms filed annual statements, and regulators conducted routine on-site inspections. It worked until the economy went digital. Today, more than 70% of Uganda's financial transactions move through digital rails: mobile money, online banking, fintech apps, and agent networks. The system is fast, but regulation is still slow.

"Regulatory inertia is the new insider threat," says a cybersecurity expert at Summit Consulting Ltd, which recently led a forensic investigation into a UGX 4.8 billion digital fraud scheme. "When you delay an advisory or a policy response, you're not being neutral; you're helping criminals by default." The expert, who asked to be referred to as Witness 1 to maintain anonymity, described the months-long delay between incident reporting and public disclosure. "Hackers read the same policies we do," he said. "They know how long it takes for a circular to be approved. They exploit that gap."

How a single memo unleashed a chain of breaches

In May 2025, a leaked internal memo from a regulator detailed an upcoming policy on "Secure Cloud Hosting for Financial Institutions." The memo wasn't public yet, but insiders knew it would require banks to migrate to approved local data centers by December 2025. Within weeks, two consulting firms began quietly offering "pre-compliance migration" services. Behind one of them was Suspect 2, a former systems administrator turned contractor. His company convinced several mid-sized institutions to move data to cheaper, unverified servers hosted abroad: offshore, untraceable, and vulnerable. By the time the official circular came out, hackers had already gained privileged access to the cloned environments.
The forensic audit by Summit Consulting later found logins from Nigeria, Russia, and Gulu, all using valid user credentials. It was an insider-assisted breach born from premature policy leakage and delayed enforcement. "The regulator should have issued an emergency alert the moment that memo leaked," says Suspect 3, a cybersecurity auditor. "Instead, they waited for a full review. The criminals didn't wait."

The anatomy of regulatory silence

To understand why regulators delay, you must look at how they are structured. Most regulators are designed to prevent corruption and maintain order, not to fight real-time cyberattacks. Their processes reward caution over speed, hierarchy over agility. A single advisory may pass through five desks: legal review, policy, communications, directorate approval, and finally the board. Each stage adds the risk of leaks, political editing, and inertia. By the time an advisory reaches the public, it is already obsolete.

Consider this: in 2024, while global regulators were issuing real-time ransomware alerts, a local regulator was still revising its 2020 Information Systems Guidelines. During that same year, an estimated UGX 25 billion was lost to electronic fraud in the banking sector alone. "Silence is not neutrality; it is negligence," says a senior executive at a commercial bank who requested anonymity. "We cannot defend systems against attacks we don't yet know are happening."

The ripple effect of delayed advisories

Every delayed advisory creates what cybersecurity experts call a "window of exploitation." That window, whether a week or a month, becomes the sweet spot for criminals. When a regulator delays announcing a new SIM card verification protocol, syndicates exploit the old one. When they postpone guidelines on agent liquidity, fake agents flourish. When they hesitate to enforce data localization, offshore fraud networks thrive.

For example, in Uganda there is a lacuna in the procurement laws that allows international consulting firms, whether in cybersecurity or other sectors, to operate without being required to first register locally or partner with a Ugandan company. This gap grants such firms unrestricted access to sensitive intellectual property and exposes national systems to significant risks.

In one case investigated by Summit Consulting, hackers used unrevised Know-Your-Customer (KYC) rules to register 400 fake SIM cards. Each SIM was linked to a dormant bank account. In a single night, they routed UGX 1.1 billion through those accounts using automated scripts. The regulator issued a public circular two months later. "By then, the trail was cold," recalls a forensic investigator. "We found digital breadcrumbs: VPNs from Nairobi, IP jumps through South Africa, then exit nodes in Finland. But the money had already been laundered through crypto wallets."

Not all silence is accidental; some is intentional. Uganda's regulators often face subtle forms of regulatory capture, when the entities they supervise wield more influence than the regulators themselves. In sectors like telecoms and fintech,
IFIS Cybersecurity Awareness Month; Day 7, October 2025, issue 7 of 30: Insurance Companies
The biggest uninsured risk is your own IT team

It began, as most tragedies do, with trust. In a local insurance company, the head of IT had been there for eight years. Loyal, quiet, and efficient. The kind of man who never raised his voice or suspicion. Yet beneath the hum of the servers he managed, a quiet betrayal brewed.

Insurance firms love to talk about coverage, floods, fires, and car crashes, but the greatest uninsured risk sits inside their own offices: their IT teams. When fraud happens, people look outward, to hackers, ransomware, or "Russian IPs." The truth? Eight out of ten breaches in the insurance industry start from within. A clever IT officer with domain access can bury evidence so deep that even the best auditors will call it "a system glitch."

Ask yourself right now: who in your organization has access to the system administrator password? If you need to think about it, you are already in danger.

"The next breach won't come from a hacker in Moscow. It will come from the man who fixed your printer last week."

The hidden syndicate inside

The insurance IT department is often small, five people, maybe fewer. They eat lunch together, go for coffee together, and sometimes retire together. That is how a syndicate forms: not in dark alleys, but in fluorescent-lit server rooms. In one Ugandan insurer, "Suspect 1" and "Suspect 2" perfected the art of invisible fraud. They started by deleting a dormant policy record to test system sensitivity. When no one noticed, they created a fake claim worth UGX 2 million. Just a test. Then another. And another. The pattern was too small to catch. But that is how syndicates grow: not by greed at once, but by confidence over time.

Draw a map of your IT team and claims officers. Who could collude without triggering a system alert?

The ghost claim factory

This is the dark heart of insider fraud: data manipulation. Using authentic customer data from onboarding systems, the syndicate built "ghost policies", fake but perfectly formatted. Real agents' names. Real policy numbers. Real dates. Payments were made to mobile money accounts registered under false IDs. No one noticed because the amounts were small, UGX 300,000 here, UGX 450,000 there. Spread over months, they totaled millions. The fraud didn't need hacking skills. It needed only access, routine, and a deadened sense of accountability.

Activity: Pick three random claims from your system today. Verify the identity of each beneficiary beyond the policy file. Do it physically, not digitally. You will be shocked at how many ghosts you're insuring.

"In Uganda's insurance sector, the most profitable customer may not exist at all."

The "patch update" disguise

Every fraud needs a disguise. For insiders, that disguise is maintenance. "We're applying a patch," they say. "The system will be down for one hour." That hour is an eternity in digital terms. It is during these "updates" that configurations are changed, logs are deleted, and backups are quietly replaced. The IT world calls it maintenance. Investigators call it the crime window. When Summit Consulting Ltd investigated one insurer, we found that every fraudulent claim coincided with a "patch update" entry in the maintenance log. That is not coincidence; it is camouflage. Review your maintenance schedules. Who authorizes them? Who supervises them? Who reviews the logs afterwards? If it is the same person, that is the first control failure.
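That maintenance-log overlap is easy to test for yourself. Below is a minimal sketch, not the tool Summit Consulting used, assuming you can export paid claims and the IT maintenance log to CSV files with the hypothetical column names shown in the comments; it simply flags every claim processed inside, or shortly after, a declared "patch update" window.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical exports; adjust names to your own systems.
# claims.csv:      claim_id, amount_ugx, processed_at  (e.g. 2025-03-14 22:05)
# maintenance.csv: ticket_id, started_at, ended_at, authorized_by
FMT = "%Y-%m-%d %H:%M"

def load(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

claims = load("claims.csv")
windows = load("maintenance.csv")

for c in claims:
    paid = datetime.strptime(c["processed_at"], FMT)
    for w in windows:
        start = datetime.strptime(w["started_at"], FMT)
        end = datetime.strptime(w["ended_at"], FMT)
        # Flag claims processed during a maintenance window or within an hour of it closing.
        if start <= paid <= end + timedelta(hours=1):
            print(f"Claim {c['claim_id']} (UGX {c['amount_ugx']}) processed during "
                  f"ticket {w['ticket_id']} authorized by {w['authorized_by']}")
```

If one name keeps appearing in the authorized_by column next to flagged claims, that is where the review starts.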
Collusion between IT and claims officers

Fraud rarely happens in isolation. IT provides access. Claims officers provide the cover story. Together, they build the perfect loop: fake claim, approved payment, deleted evidence. One insurer discovered that its "system crash" reports always followed large claim approvals. When digital forensics reconstructed deleted records, two logins emerged, one from IT, one from Claims, five minutes apart. Coincidence? Not a chance.

List all functions in your claim approval chain. Is there a single point where one person can approve, pay, and erase a transaction? If yes, you have already written your own fraud policy.

"Fraud is not born in dark rooms. It is born in relationships of trust, between people who know each other too well."

The mobile money loophole

Convenience kills control. Mobile money has become the new frontier for insurance payouts: fast, low-cost, and paperless. But it is also a paradise for ghost claimants. Fraudsters exploit untraceable SIM cards, splitting payouts across multiple numbers registered under relatives or acquaintances. In one case, investigators found 14 wallets linked to the same device IMEI (International Mobile Equipment Identity). The system checked phone numbers, not devices.

Audit your last 100 mobile payouts. Check whether any numbers share the same device IMEI or transaction fingerprint. If they do, call the telecom. You're funding a ghost.

The failed segregation of duties

Ugandan insurers love to talk about "internal controls." Yet most IT departments have one person who serves as system admin, database admin, and backup admin. That is like letting one man hold both the bank keys and the CCTV remote. When Summit Consulting reviewed an insurer's access matrix, we found one user with privileges to alter claim approvals and purge logs, a digital superuser. The man was on leave. But his credentials were active.

Print your IT access list. Count how many people can both approve and delete system data. The number should never exceed one, and even that one should have a watcher.

"In cybersecurity, segregation of duties is not just a principle; it is survival."

How red flags were missed

Auditors came every quarter, ticked boxes, confirmed that backups existed, verified that reconciliations matched, and never asked how. The losses, about UGX 3.4 billion, were hidden in plain sight across 312 micro-claims. None exceeded the internal audit materiality threshold. That is how insiders think: below the radar, above suspicion. Lower your internal audit threshold for random testing. Sometimes the smallest losses reveal the biggest scandals.

How the investigators cracked it

When the insurer's new CEO noticed that "fraud recoveries" kept reappearing every quarter, he called Summit Consulting Ltd. The digital forensics trail led us to late-night VPN logins, falsified timestamps, and system
Cybersecurity Awareness Month; Day 6, October 2025, issue 6 of 30: Universities – how exams are hacked before they are set
Cybersecurity is now an integrity issue. Academic credibility is one breach away.

In June, I wrote to several universities and training institutions urging them to rethink their cybersecurity and examination systems. I got no response. Perhaps they still believe cybersecurity is an IT issue, not an integrity issue.

Artificial Intelligence has become the new exam leak. Students no longer smuggle notes; they smuggle algorithms. They do not copy answers; they generate them. The crisis is no longer about cheating; it is about credibility. AI is quietly eroding trust in academic qualifications. Employers now ask, "Did this graduate learn or just generate?" The distinction between brilliance and plagiarism is disappearing. Universities that ignore this shift are preparing students for a world that no longer exists. This is why cybersecurity and academic reform must now move together. The next breach won't come from a hacker; it will come from a chatbot.

Join us this Cybersecurity Awareness Month. Attend and sponsor the Cybersecurity and Risk Management Conference on 16th October 2025 at Speke Resort Munyonyo. Because the integrity of your institution is now your strongest firewall.

The illusion of academic integrity

Most university leaders still believe cheating begins in the exam room, with crib notes, phones under desks, and "helpful" invigilators. That belief is a dangerous illusion. Cheating today starts in the server room. Academic fraud has evolved. The modern cheat doesn't hide a paper chit; they manipulate digital systems, intercept emails, and buy leaked exams weeks before they are printed. The true battlefield is not in lecture halls but in university data centers, where passwords are reused, systems are unpatched, and "temporary staff" have permanent access.

The tragedy is that integrity, once a personal virtue, is now a cybersecurity metric. The lecturer's password is the university's moral compass. Every compromised email account can alter a grade, rewrite a transcript, or destroy years of reputation. Universities must understand: their biggest cybersecurity risk is not ransomware. It is academic dishonesty engineered through digital access.

The dark campus economy

Every university has a silent underworld, an informal marketplace where grades, exam drafts, and coursework circulate like contraband. "Suspect 1," a lab assistant in one Ugandan university, discovered that information sells faster than airtime. His part-time job maintaining the exam submission portal gave him privileged access to uploaded papers. With a single copied folder, he could earn what his salary took three months to provide. In closed WhatsApp and Telegram groups labeled "revision materials," leaked exams trade hands for mobile money or data bundles. Students call it "help." Staff call it "hustle." The system calls it "a breach."

Academic corruption now has a payment gateway. Mobile money has become the bloodstream of digital dishonesty, leaving behind transaction trails that only a skilled forensic auditor can trace. When Summit Consulting investigated a similar case, patterns emerged: suspicious deposits before exam weeks, identical IP addresses linking students to staff, and "midnight logins" that no one could explain. The dark campus economy is not fiction. It is the new tuition business.
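Patterns like shared IP addresses and "midnight logins" can be pulled out of ordinary portal or web-server logs with a few lines of scripting. A minimal sketch, assuming a hypothetical export named portal_logins.csv with username, role, ip_address, and timestamp columns, not any particular university's system:

```python
import csv
from collections import defaultdict
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S"
users_by_ip = defaultdict(set)   # ip -> {(role, username), ...}
off_hours = []                   # logins between 11 pm and 5 am

with open("portal_logins.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        ts = datetime.strptime(row["timestamp"], FMT)
        users_by_ip[row["ip_address"]].add((row["role"].lower(), row["username"]))
        if ts.hour >= 23 or ts.hour < 5:
            off_hours.append((row["username"], row["role"], row["timestamp"]))

# An IP address used by both staff and student accounts deserves a second look.
for ip, users in users_by_ip.items():
    roles = {role for role, _ in users}
    if {"staff", "student"} <= roles:
        print(f"Shared IP {ip}: {sorted(users)}")

for username, role, when in off_hours:
    print(f"Off-hours login: {username} ({role}) at {when}")
```

Neither signal is proof of anything on its own; the point is to turn "no one could explain" into a short list of questions someone must answer.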
How exams are hacked before they are set

Many institutions assume exams leak after being printed. In reality, most leaks happen before they are even finalized. The exam-setting process is a chain of trust, and every link is fragile. Papers are typed, emailed to subject heads, reviewed, printed, sealed, and stored. Along this chain, a careless click or a misplaced flash drive can expose the entire exam. "Suspect 2," a part-time IT technician, was responsible for server backups. One evening, he made a copy of the exam folder "to ensure redundancy." That duplicate, saved on his personal hard drive, was later accessed by students preparing for the same course. The breach didn't involve a hacker from Russia. It involved a careless human in Kampala.

Universities often rely on shared folders, unencrypted drives, and unsecured emails. Exams are stored in the same location as coursework files, making them easy targets for anyone with modest digital curiosity. Cybersecurity is not about firewalls but about process design. The moment an exam touches a personal device, it is already public.

The password paradox

In one investigation, Summit Consulting found over 60% of university staff using passwords like Admin123, Welcome2020, or their own names. These are not passwords. They are resignation letters waiting to be signed. The irony is that many lecturers teach information systems but still reuse personal passwords across university platforms. A single compromised Gmail account can open the student record system, the marks portal, and the HR database. Cybercriminals do not need to hack; they just need to guess. And they often guess right.

When asked why they don't change passwords, one senior academic replied, "Because I forget them." That excuse cost the institution its entire assessment database when a stolen password was used to download exam drafts. In cybersecurity, laziness is a luxury no university can afford. Passwords protect more than data; they protect dignity. Every institution must treat password hygiene as a cultural reform, not a technical policy.

The invisible insider

The most dangerous hacker on campus already has an ID card. Insiders know the systems, processes, and blind spots. They understand how marks are uploaded, who approves results, and when systems are most vulnerable. "Suspect 3," a data entry clerk, used their access to alter grades in exchange for "transport." Access control existed, but accountability didn't. Multiple users shared login credentials "for convenience." When the marks changed mysteriously, no one could tell whose account was used. This is how academic manipulation thrives: through shared passwords, weak segregation of duties, and the absence of digital logs. Internal fraud rarely looks like a crime; it looks like teamwork.

Universities must rethink access. Not everyone who touches data needs to edit it. Not everyone who edits data needs to approve it. And not everyone who approves should do so without a second set of eyes. The invisible insider is not a tech problem. It is a governance problem disguised as convenience.

The anatomy of an academic breach

Picture this: a student connects to campus
Cybersecurity Awareness Month; Day 3, October 2025, issue 3 of 30: Fraud in NGOs
"Your NGO is not losing money through sacks of maize. It is losing it through megabytes of data."

Most leaders still picture fraud in physical form: missing fuel, fake receipts, ghost beneficiaries, or warehouses half-empty. Yet today's fraudster doesn't touch a truck. He touches a keyboard. Donor dollars vanish quietly, with the click of an "approve" button, authorized by login credentials stolen from the very staff you trust.

NOTICE: Effective 1st October 2025, the technical training arm of Summit Consulting Ltd, the Institute of Forensics and ICT Security (IFIS), has taken the lead as Uganda's global champion for Cybersecurity Awareness Month. To mark this, we are sharing powerful cybersecurity insights across all our newsletters, bringing cybersecurity to the very center of governance and leadership conversations. And we are not stopping there. You and your team can now register for a free virtual Cybersecurity Awareness Session worth UGX 5 million, offered at no cost as part of our global cybersecurity awareness campaign. Simply visit https://event.forensicsinstitute.org/ to secure your slot. For organizations that prefer in-person training, IFIS is offering on-site sessions at your place of work at a facilitation fee of only UGX 500,000 per team, per session. Do not gamble with silence. Invest in awareness before a breach forces you to pay in panic. Register today.

The danger is simple: corruption has gone digital, but leadership is still analog. Boards debate procurement policies while ignoring who controls the server. EXCOs argue over per diem ceilings while the finance portal has no two-factor authentication. Leaders obsess about visibility in the field but remain blind to what happens on the network.

The greatest fraud in NGOs today is not collusion between procurement and stores. It is collusion between IT and Finance. Why? Because donor funds move electronically, via SWIFT, mobile money, or internal transfers. A single insider can reroute funds to a ghost service provider with no warehouse trail, no physical audit, and no drama.

Other common ways NGOs lose money include:
Ghost beneficiaries. Hundreds of "recipients" with AI-generated photos and phone numbers. Mobile wallets opened, funds disbursed, activity logged. Paper trail? Fabricated.
Fake invoices. Suppliers that exist on paper only. Scanned invoices, doctored purchase orders, and bank transfers to shell accounts.
Inflated procurements. Legitimate vendors collude with program staff to inflate prices; the surplus is siphoned off in cash.
Payroll ghosts. Phantom staff on the payroll; advances "reconciled" with forged signatures.
Consultant capture. Phantom consultants submit glossy reports; payments are made up front, and no deliverables are delivered.
Cash corridors. Field cash disbursements "recorded" with receipts that match no serials; collectors disappear.
Collusion with IT. Low-privilege staff climb privileges; approval workflows are bypassed or backdated.
AI-assisted deception. Deepfakes for IDs, synthetic invoices, and convincingly forged emails that pass basic verification.

"Donor dollars are not stolen in sacks. They leak in megabytes."

Hope is not an audit trail. Boards that obsess over petty receipts while ignoring digital controls are writing their own obituary.
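Several of the leaks listed above, ghost beneficiaries in particular, show up in the disbursement data long before a field audit does. A minimal sketch, assuming a hypothetical CSV export of mobile money disbursements with beneficiary_id, beneficiary_name, phone, and amount_ugx columns; it flags any phone number collecting for more than one "beneficiary":

```python
import csv
from collections import defaultdict

# Hypothetical export: disbursements.csv with columns
# beneficiary_id, beneficiary_name, phone, amount_ugx (whole shillings)
payouts_by_phone = defaultdict(list)

with open("disbursements.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        payouts_by_phone[row["phone"].strip()].append(row)

for phone, rows in payouts_by_phone.items():
    names = {r["beneficiary_name"].strip().lower() for r in rows}
    total = sum(int(r["amount_ugx"]) for r in rows)
    # One wallet collecting for several "different" people is the classic ghost pattern.
    if len(names) > 1:
        print(f"Phone {phone} received UGX {total:,} for {len(names)} different names: {sorted(names)}")
```

The same grouping, run on device IMEI or national ID where the telecom or KYC data is available, catches the splits that a phone-number check misses.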
Defence is simple in principle, hard in execution: strong segregation of duties, real-time anomaly detection (yes, AI can help), mobile money reconciliations matched to KYC, mandatory forensic spot checks, and a board-level cyber posture that treats prevention as governance.

The Cyber Leakage assessment tool

This is a practical governance tool every NGO EXCO should adopt. It tracks four dimensions:
Access control: who has the keys to your financial system?
Transaction monitoring: are donor dollars matched to verified beneficiaries?
Data visibility: can leadership see anomalies in real time, or only in quarterly reports?
Incident readiness: when, not if, a breach occurs, do you have a tested response plan?

Today's activity

As part of this month, run a boardroom simulation. Give your managers a scenario: "A hacker has cloned your NGO's domain email. Donors have received fake invoices with your logo. How do you detect, respond, and reassure donors within 48 hours?" The discussion will reveal gaps in awareness and controls faster than any report.

"Donor confidence is not lost when you are hacked. It is lost when you have no credible answer." – Mr Strategy

Cybersecurity is no longer about IT departments. It is about leadership survival. If you run an NGO, know this: the future of donor trust lies not in your warehouses but in your WiFi. Join us this October. Cybersecurity Awareness Month is your chance to shield your NGO from the fraud you cannot see but cannot afford to ignore.

Board briefing on Cyber Leakage Readiness vs Target

Donor dollars are stolen in MBs, not in sacks. Chart 2 below shows your NGO's cyber leakage readiness against the global best practice target (scale 0–5). While the board routinely debates fuel theft or procurement receipts, the real leakage is invisible, happening in data flows, weak systems, and insider collusion.

Chart 2: Cyber Leakage Readiness vs Target (sample data)
Access Control: 5 vs 5
Transaction Monitoring: 0 vs 5
Data Visibility: 8 vs 5
Incident Readiness: 5 vs 5

Key implications for the board

There is a trust gap with donors. They assume their money is cyber-secure. Weak scores show a silent erosion of trust. Once lost, donor confidence rarely returns.
Clear oversight blind spots are visible. Boards focus on warehouse audits, yet digital leakage bypasses those checks. Cyber fraud today requires governance, not just finance oversight.
At this level of readiness, the NGO is effectively unprepared to respond to a cyber breach. A single spoofed email or fake invoice could cripple operations. This incident paralysis must be addressed immediately.
Collusion risk at scale. The weakest links are Transaction Monitoring and Incident Readiness. This is where IT and Finance collude undetected, because the board lacks visibility.

Red flags to look out for

Staff sharing logins to financial systems.
Email approvals without multi-factor authentication.
No real-time visibility of funds-to-beneficiary matching.
An incident response plan that is either untested or non-existent.

Action steps for EXCO and board

Mandate Cyber Leakage Radar reporting – Require quarterly board dashboards on the four dimensions.
Close the readiness gap – Prioritize
Cybersecurity Awareness Month; 2nd October 2025, issue 2 of 30
Your core banking system is not your biggest risk. Your tellers are.

The wheelbarrow mentality is a condition where staff wait to be pushed to do what they are already paid to do. In banking, this mentality is lethal. A teller who logs into the core system with one hand, while texting a cousin on mobile money with the other, is not just inefficient; he is your greatest cyber liability.

CEOs love to boast about multimillion-dollar core banking upgrades. "We bought the latest system," they say, as if code could cure culture. It cannot. A teller with a smartphone can reroute millions faster than your IT firewall can blink. Hackers don't need to break through your perimeter when the insiders open the gates daily.

Every fraud story I have investigated begins the same way: with misplaced trust in "loyal" staff. The weak passwords written on sticky notes. The workstation left unlocked for tea. The supervisor who signs off without verifying. Banks do not lose billions to Russian hackers; they lose them to inattentive, underpaid, or compromised insiders.

It is a paradox: the more technology you deploy, the more human discipline you require. Yet most boards spend 90% of their budgets on systems, and less than 5% on building a security-aware culture. That is like buying a bulletproof car and hiring a reckless driver.

"Cybersecurity is not about technology. It is about trust. And trust, once broken, is uninsurable."

If you are a CEO, your greatest cyber risk does not sit in Moscow or Lagos. It sits in your banking hall, smiling, stamping, and waiting for a chance to strike. Audit culture as aggressively as you audit systems. Train, monitor, and enforce discipline daily. Technology is only as strong as the teller who uses it. Stay safe.

Most leaders talk about cyber as if it were an IT line item. That is why they lose. Hackers don't attack firewalls; they exploit governance gaps. The weakest control is often not the system; it is the boardroom silence. To win, directors need a simple but ruthless tool that cuts through jargon and exposes blind spots. Enter the Cyber Risk Radar™: a one-page governance weapon that forces the right questions, demands evidence, and shows instantly whether your organization is drifting toward breach or building resilience. This is not a checklist for IT. It is a mirror for the board.

Table 1: The board's Cyber Risk Radar

1. Insider threat exposure
Board question to ask: If a teller left their desk unlocked, how much could we lose before detection?
Evidence required: Data on maximum exposure per workstation; incident logs.
What "weak" looks like: No monitoring; staff share logins; no transaction caps.
What "strong" looks like: Real-time monitoring; auto-logouts; transaction caps per user.
Board action: Demand simulation results; insist on quarterly insider threat testing.

2. Cyber red flag dashboard
Board question to ask: Do we have a one-page quarterly dashboard?
Evidence required: Dashboard showing logins, insider breaches, near-misses, and financial exposure.
What "weak" looks like: IT jargon slides; no metrics linked to money.
What "strong" looks like: Clear numbers tied to financial risk and trends.
Board action: Require the dashboard as a standing board pack item.

3. Executive accountability
Board question to ask: Whose bonus is reduced if we suffer a breach?
Evidence required: HR policy linking EXCO pay to cyber incidents.
What "weak" looks like: "Cyber is IT's problem."
What "strong" looks like: CRO, CIO, and COO have performance-linked accountability.
Board action: Direct RemCo to tie pay to cyber outcomes.

4. Business continuity drill
Board question to ask: What happens if the system goes down for one hour?
Evidence required: Documented BCP/DRP test results; staff performance logs.
What "weak" looks like: Panic; no plan; reliance on IT improvisation.
What "strong" looks like: Blackout drill executed; operations continue via backups or manual fallback.
Board action: Order an annual "blackout drill" with the board observing the results.

5. Board cyber maturity score
Board question to ask: How do we rate ourselves: ignorant, informed, or intelligent?
Evidence required: Independent maturity assessment; board training records.
What "weak" looks like: Board waits for IT updates; no training.
What "strong" looks like: Board challenges assumptions, links cyber to strategy, demands controls.
Board action: Schedule quarterly board self-assessment and annual cyber training.

How to use this table in practice

Insert it into every quarterly board pack. Score yourselves honestly on each dimension (1 = weak, 5 = strong). Track movement quarter by quarter. If you're not moving up, you're drifting into irrelevance.
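One way to keep that quarter-by-quarter discipline honest is to hold the scores in a machine-readable form rather than in slides. A minimal sketch, with illustrative scores only, of how a board secretary might track movement on the five dimensions and flag the ones that demand attention:

```python
# Illustrative self-assessment scores (1 = weak, 5 = strong) per quarter.
RADAR = {
    "Insider threat exposure":   {"Q1": 2, "Q2": 2, "Q3": 3},
    "Cyber red flag dashboard":  {"Q1": 1, "Q2": 2, "Q3": 2},
    "Executive accountability":  {"Q1": 1, "Q2": 1, "Q3": 1},
    "Business continuity drill": {"Q1": 2, "Q2": 3, "Q3": 4},
    "Board cyber maturity":      {"Q1": 2, "Q2": 2, "Q3": 3},
}
QUARTERS = ["Q1", "Q2", "Q3"]

for dimension, scores in RADAR.items():
    history = [scores[q] for q in QUARTERS]
    trend = history[-1] - history[0]
    status = "improving" if trend > 0 else "stagnant or drifting"
    flag = "  <-- board attention needed" if history[-1] <= 2 or trend <= 0 else ""
    print(f"{dimension:27s} {history} {status}{flag}")
```

The output is deliberately blunt: any dimension still at 2 or below, or not moving, gets flagged before the next board meeting.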
"Cybersecurity is not a technical war. It is a governance war. And boards lose by silence." – Mr Strategy

About IFIS: https://forensicsinstitute.org/about/

At IFIS, we live by our motto, "Discere Faciendo. Learn by Doing." Every course, certification, and training session emphasizes practical, hands-on skills that empower you to solve real-world challenges from day one. Learn by doing, and be empowered to transform your career and life. At the Institute of Forensics & ICT Security (IFIS), we specialize in bridging the gap between knowledge and application. Whether you are navigating the challenges of cybersecurity, mitigating enterprise risks, investigating fraud, or analyzing complex data, our cutting-edge certifications and practical training programs prepare you to lead in today's dynamic world. Come and get skills you can apply to your job instantly, and transform your career and life.

Copyright IFIS 2025. All rights reserved.
Cybersecurity Month 2025 is here – Stay aware, stay protected
October is here. Across Uganda, thousands of employees will walk into offices, open emails, and click without thinking. One careless click is all it takes. That is why Cybersecurity Awareness Month is not just another "theme month." It is a survival drill.

Cybercrime is no longer a distant story from America or Europe. It is a Ugandan reality. From SACCOs in Masaka losing millions via mobile money SIM swaps, to hospitals in Kampala locked out of patient records by ransomware, to government agencies paying ransom quietly, cyber risk is here. It is local. It is expensive. And it is growing.

Why awareness matters

The biggest myth in cybersecurity is that technology alone will protect you. Firewalls, antiviruses, and fancy dashboards mean nothing if your people are blind to threats. Eight out of ten breaches in Uganda begin with human error: a staff member reusing passwords, downloading fake invoices, or sharing sensitive data on WhatsApp. Cybersecurity Awareness Month exists to break that ignorance and to remind every staff member that they are the first firewall.

The cost of ignorance

A Tier 2 bank lost UGX 1.4 billion in a single phishing campaign. A private school had its entire website defaced, leaving parents questioning its credibility. An NGO donor froze funding after hackers exposed project data. None of these began with a "major hack." They began with a lack of awareness.

What must leaders do this October?

Make cybersecurity cultural, not seasonal – One month of slogans is not enough. Embed cyber habits into daily work.
Invest in drills, not posters – Staff remember simulated phishing tests, not motivational banners.
Hold EXCO accountable – Cyber risk is a governance issue. Boards must demand evidence of preparedness, not promises.

At Summit Consulting Ltd, we say: "Cybersecurity is not IT. It is a survival strategy." This month, we are running free awareness trainings for organizations that dare to take risk seriously. One hour with us could save your organization billions. The question is not whether hackers will strike. It is whether your team will recognize the attack when it happens. Stay aware. Stay protected. Visit https://event.forensicsinstitute.org/ to access free resources. Be safe online.

Cybersecurity Month 2025 is here: Stay aware. Stay protected.

One careless click. That's all it takes to lose millions. Cybercrime is no longer a foreign headline; it is Ugandan. A SACCO in Masaka was wiped out via SIM swaps. A Kampala hospital was locked out of patient records by ransomware. A Tier 2 bank lost UGX 1.4 billion to phishing. None of these started with "big hacks." They started with ignorance. The truth? Technology won't save you if your people are blind. Eight out of ten breaches in Uganda begin with human error: weak passwords, fake invoices, or sharing data on WhatsApp.

This October, the Institute of Forensics & ICT Security, the technical training arm of Summit Consulting, is leading Cybersecurity Awareness Month. We are offering free awareness training to organizations ready to treat cyber risk as a governance issue, not an IT problem. Boards, EXCOs, and CEOs: stop asking, "What if hackers strike?" Start asking, "Will my team recognize the attack when it comes?" Register your team today. Save your reputation tomorrow: events.forensicsinstitute.org. Stay aware. Stay protected.
Small risks, big consequences: Why details matter
UGX 46 million vanished in less than an hour. Not through a sophisticated hack. Not through some dark web syndicate. But because one junior accountant at a mid-sized SACCO forgot to press Ctrl+Alt+Del before stepping out for tea. That's it. One forgotten click.

While managers were busy preparing for their weekly meeting and members queued outside to deposit their hard-earned savings, Suspect 1, a teller with sharp instincts for weakness, seized the moment. He slid into her still-active workstation, typed nothing, and yet unlocked everything. Within minutes, he rerouted UGX 76 million through three mobile money accounts. By the time the IT team noticed "irregular logins," the cash had already been withdrawn in brown envelopes from Kikuubo agents. This wasn't a cyber genius at work. It was a crime of opportunity, powered by complacency.

Here's the bitter truth: fraud in Uganda rarely starts with billion-shilling heists or complex malware. It starts with tiny details everyone dismisses. A door left ajar. A system left logged in. A control left unenforced. And because leaders gamble that "nothing big will happen," something massive always does. This SACCO didn't lose because it lacked technology. It lost because it lacked discipline.

The myth of "small" risks

Many leaders are obsessed with the big stuff: market share, regulatory approvals, new loans, donor inflows. Yet it is the "small" things that cripple organizations. A missed reconciliation of UGX 200,000 in petty cash. A supplier contract missing one clause on delivery timelines. An unpatched firewall ignored because "IT is busy." A guard sleeping outside the warehouse, "just one night." Each feels negligible in isolation. But in practice, small risks are never small. They are the loose threads that unravel the entire fabric.

The details that destroyed giants

The ghost fuel deliveries – A transport company ignored "minor" discrepancies in trip sheets. Over time, those "few missing litres" added up to UGX 1.2 billion siphoned off by colluding drivers and pump attendants. The board only woke up when clients began to terminate contracts.
The weak password tragedy – A private university IT officer reused the same password across systems. Hackers cracked it within minutes. What began as a "small vulnerability" led to the leak of student data, lawsuits, and millions in damages.
The fake signature scandal – A government project officer approved "small" field expenses without verifying signatures. For two years, fictitious names collected allowances. By the time Summit Consulting was hired, UGX 3.7 billion had evaporated.

The pattern is clear: small risks ignored turn into scandals that cripple.

Why leaders dismiss details

The psychology is simple: details feel boring, beneath senior executives. Boards like grand narratives, not red flags about missing invoices. CEOs prefer PowerPoint on expansion strategies, not notes on untrained security guards. But risk thrives in the margins, not in the headlines. Remember: the Titanic wasn't sunk by a fleet of icebergs. It hit one small detail, an iceberg tip nobody thought mattered.

Take Suspect 2, a procurement officer in a local hospital. She began by "borrowing" UGX 100,000 from supplier refunds. Nobody noticed. Encouraged, she increased to UGX 500,000, then UGX 2 million. By year three, she had rerouted over UGX 600 million. When caught, her defence was chilling: "If they ignored the small things, why wouldn't I keep going?" Fraud rarely begins with billions. It begins with overlooked details.
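That escalation pattern, small, repeated, steadily growing amounts under one person's hands, is exactly what simple analytics catch early. A minimal sketch, assuming a hypothetical CSV export of refunds or expense payments with officer, amount_ugx, and date columns, and an illustrative approval threshold; it flags officers whose transactions cluster just below the limit or keep growing:

```python
import csv
from collections import defaultdict
from datetime import datetime

# Hypothetical export: transactions.csv with columns officer, amount_ugx, date (YYYY-MM-DD)
APPROVAL_THRESHOLD = 1_000_000   # illustrative approval limit in UGX

by_officer = defaultdict(list)
with open("transactions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        by_officer[row["officer"]].append(
            (datetime.strptime(row["date"], "%Y-%m-%d"), int(row["amount_ugx"]))
        )

for officer, txns in by_officer.items():
    txns.sort()                                   # chronological order
    amounts = [amount for _, amount in txns]
    just_below = [a for a in amounts if 0.9 * APPROVAL_THRESHOLD <= a < APPROVAL_THRESHOLD]
    escalating = len(amounts) >= 3 and amounts[-1] > 2 * amounts[0]
    if len(just_below) >= 3 or escalating:
        print(f"{officer}: {len(amounts)} payments totalling UGX {sum(amounts):,} | "
              f"just-below-threshold: {len(just_below)} | escalating: {escalating}")
```

Neither flag proves fraud; each simply tells the reviewer which file to open first.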
The red flags good investigators look for

When we investigate fraud, we don't start with the "big scandal." We start with the details:
Expense claims repeatedly just below approval thresholds.
Staff who never take leave (afraid their fraud will be discovered).
Delayed reconciliations excused as "system issues."
IT logs showing after-hours access that nobody questions.
Petty cash that never balances to the last shilling.

Each is a whisper of a coming storm. Ignore them, and you invite catastrophe.

Why do details matter?

Culture – A culture that ignores details creates silent permission for fraud. If bosses laugh off small control breaches, staff take it as a green light.
Compounding effect – UGX 100,000 stolen weekly becomes UGX 5.2 million annually. Over five years, it is UGX 26 million. By then, the fraudster has graduated to bigger schemes.
Regulatory cost – Donors and regulators don't care whether theft began "small." They penalize based on total loss. And in Uganda, reputational damage is instant and unforgiving.

Lessons for leaders

Interrogate the details – Ask about small variances, small delays, small exceptions. That's where truth hides.
Reward vigilance, not speed – A staff member who takes extra minutes to cross-check signatures is more valuable than one who rushes.
Automate the boring stuff – Use fraud analytics and dashboards. Machines don't get bored by details; humans do.
Hold managers accountable for the "small stuff" – Don't let senior leaders hide behind strategy slides. Make them answer for reconciliations, leave rosters, and password policies.

At Summit, we tell clients: "Ignore the decimal point, lose the whole figure." Every investigation we've cracked, whether UGX 80 million or UGX 8 billion, started with small anomalies someone dismissed. Our forensic accountants don't chase headlines. They chase details. That's how we catch the ghosts.

The devil lives in the details

Small risks are never small. They are termites chewing silently at the foundation. They rarely shout, but they always multiply. And by the time leadership notices, the cost is catastrophic. The riskiest leaders in Uganda today are not the ones who gamble boldly. They are the ones who ignore details, shrugging off small risks as "minor." The next fraud in your organization won't start with UGX 1 billion. It will start with the UGX 100,000 nobody cares about. The question is, who is watching the details?

Copyright IFIS 2025. All rights reserved.
Why ignoring risk is the riskiest move of all
Unthinkable. It happened at 11:48 p.m. in a private hospital in Kampala. The lights were still on, but the hospital's heartbeat, the patchwork of digital and manual systems holding it together, flatlined. The pharmacy system froze. The laboratory printer jammed mid-test. The mobile money integration for patient payments collapsed. Even the old desktop server that held patient histories blinked into darkness.

Nurses in the ICU reached for files that weren't there. The paper charts had long been replaced with a "digital records upgrade" that now lay hostage to a system crash. In the theatre, a surgeon barked: "Get me the blood group!" But the lab technicians stood helpless; the results were trapped in the system. An intern sprinted down the corridor, searching for handwritten notes. A nurse fumbled through a drawer of loose papers, praying for a clue. Relatives pressed against the glass windows, panicked, whispering, "What if someone dies?" One mother clutched her rosary, eyes fixed on the ICU where her child lay on a ventilator. The machines still beeped, but no one could pull up the latest dosage records. What began as "minor systems maintenance" had spiraled into a night of terror.

By morning, the CEO arrived, sweating, shaken, summoned by a flurry of midnight calls from doctors and the board chair. His first question was blunt: "How did this happen?" The bitter truth is that it hadn't "just happened." For months, the IT officer had raised red flags. The hospital was running on outdated software, free antivirus, and a third-party backup service that had never been tested. Internal audit had flagged the risks, filing memos no one read. Leadership, eager to look modern with a "digital hospital" brand, had gambled that prevention was too costly. They were wrong.

By the time the systems limped back seven hours later, two scheduled surgeries had been postponed. One patient's transfer to the ICU had been delayed because payment could not be confirmed. The pharmacy issued wrong doses due to a lack of updated stock records. Trust shattered. Word spread across WhatsApp groups: "Don't go there, they nearly killed people last night."

The financial cost ran into hundreds of millions. But the reputational damage was unquantifiable. In Uganda, where hospitals live or die by word of mouth, this was lethal. Families spoke of negligence. Journalists sniffed for a story. Regulators circled like vultures. It wasn't just a system crash; it was a mirror held up to leadership blindness, choosing optimism over action, brand over backbone. And in those seven hours, lives dangled on the edge because someone thought silence was cheaper than prevention.

The illusion of safety

Leaders often mistake silence for safety. Because no disaster is visible today, they assume tomorrow will be the same. But risk is like termites in timber. It eats silently, invisibly, until one day the entire roof caves in during a storm. Think about it. How many Ugandan companies waited until fraud broke headlines before they strengthened controls? How many universities ignored student unrest until a protest burned down offices? How many hospitals shrugged off weak fire systems until lives were lost? Ignoring risk is not risk avoidance. It is deferred suicide.

The psychology of ignoring risk

Why do smart executives act dumb when it comes to risk?
Three reasons stand out:
Optimism bias – "We have never had a major fraud before, so why should it start now?" That is the reasoning of a chicken celebrating Christmas Eve because the farmer hasn't slaughtered it yet.
Short-termism – Many executives are rewarded for quick wins, not for preventing invisible disasters. Why spend UGX 200 million on cybersecurity when you can buy new cars for management and show "progress"?
Fear of bad news – Some boards treat risk officers like prophets of doom. Raise too many alarms, and you're branded "negative." So auditors soften language, executives sugarcoat reports, and directors sleep through board packs. Until the wake-up call arrives at 2 a.m.

The hidden cost of ignored risk

The cost of ignoring risk is never obvious on day one. It accumulates quietly. Banks that ignore credit concentration wake up to billions locked in real estate loans when the sector crashes. NGOs that ignore whistleblower reports discover 30% of project funds siphoned off by "ghost beneficiaries." Manufacturers that ignore machinery maintenance see production halt when a single bearing breaks. Government agencies that ignore data security pay ransom in Bitcoin to hackers hiding in Moscow or Nairobi. Risk is a tax collector. It never forgets, never forgives, and always charges compound interest.

A case in point

I'll share three anonymized cases from my investigations with Summit Consulting Ltd:
Case of the vanishing payroll – In 2023, a government parastatal ignored audit flags about payroll irregularities. "We'll deal with it next quarter," said the HR director. By the time Summit was called in, over UGX 4.8 billion had been siphoned into mobile money wallets linked to ghost employees. The red flags were visible for two years. They were simply ignored.
Case of the locked warehouse – A local FMCG company ignored repeated risk reports about inventory mismatches. One weekend, staff arrived to find the warehouse padlocked, not by management, but by suppliers owed millions. Goods worth UGX 2.3 billion were trapped inside. That "minor reconciliation issue" turned into a company-wide crisis.
Case of the paralyzed hospital – A private hospital ignored cybersecurity warnings, assuming "hackers only target big banks." In June 2024, ransomware locked patient records. Doctors could not access lab results or prescriptions for 48 hours. Two patients died. The losses went beyond money: reputation, trust, and human lives.

Each case shows the same pattern: warnings existed, but leaders chose inaction.

Why ignoring risk is leadership failure

The board's role is not to cheer quarterly profits but to protect long-term survival. Ignoring risk is leadership malpractice. It signals three weaknesses:
Poor governance – When boards don't challenge management, blind spots become black holes.
Weak culture – Organizations that punish whistleblowers and auditors cultivate silence, not vigilance.
Complacency – Success

