A cybersecurity case leaders rarely want to hear

The breach did not come through the firewall. It came through a smile.

On paper, the organisation was doing everything right. Firewalls patched. Antivirus updated. External penetration tests passed. The board slept well, believing the risk lived "outside": hackers in hoodies, foreign IP addresses, dark web threats. The reality was closer. Much closer.

The breach began with a familiar face. A long-serving systems administrator. Ten years in. Trusted and dependable. Always available when systems went down at night. The kind of person whose access requests were approved without hesitation.

That is how most serious cyber incidents begin. Familiarity is not trust. It is exposure. In cybersecurity, the biggest threat is not malicious intent. It is unquestioned access.

The administrator did not set out to steal data. That is important. He was under pressure: personal debt, school fees, side hustles. He reused credentials across systems "to save time." He disabled certain logs because "they slowed the system." He shared admin passwords informally with a colleague during a crisis and never rotated them afterwards. None of this triggered alarms. Why would it? He was one of "us."

Then a phishing email landed in his inbox. Not dramatic. Not sophisticated. It referenced an internal system upgrade and used language copied from previous internal emails. He clicked. The attacker did not need to break in. They were invited.

Within hours, lateral movement had begun. Privileged access meant the attacker could see everything: user directories, financial systems, and backups. The breach went undetected for weeks because the activity looked normal. It was executed using a legitimate account, at normal hours, by someone who had always been there.

This is the part boards struggle with: cybersecurity fails socially before it fails technically.

Why familiar faces are the hardest risks to see

First, they blend into the noise. Security teams are trained to look for anomalies. Familiar users generate none. Their behaviour becomes the baseline.

Second, leaders override controls for them. "He needs quick access." "She has been here for years." Temporary exceptions accumulate. Cyber risk compounds quietly through kindness.

Third, reporting lines blur. When someone is both critical to operations and deeply trusted, no one wants to challenge them. Reviews become ceremonial. Access recertification becomes box-ticking.

I have seen this pattern across banks, universities, and government agencies. The longest-serving staff often carry the widest, least-reviewed access. Not because they are bad, but because no one ever went back to redesign the system around growth.

The red flags that were missed

In this case, the signs were there. System logs were thinner than expected. Privileged access had never been reduced despite role changes. Backups were accessible from production accounts. Alerts were configured, but no one reviewed them regularly.

Most tellingly, cybersecurity was still framed as an IT issue. The board asked about tools, not behaviours. Budgets were approved for software, not for access governance or insider threat modelling.

The breach was discovered only after customer data appeared on a forum. By then, the question was no longer "how did this happen?" but "why didn't we see it coming?"

The lesson for leaders

Cybersecurity maturity is measured by how you manage the people you trust most. If your most familiar faces have never had their access challenged, you are exposed.
If your cybersecurity dashboards never discuss insider risk, you are blind. If your board is comfortable, your organisation is not safe.

This is not an argument for suspicion. It is an argument for discipline. Trust should trigger stronger controls, not weaker ones.

What boards and executives must do differently

Reframe cybersecurity as an organisational risk, not a technical one. Demand regular privileged access reviews led independently of IT. Rotate credentials ruthlessly. Separate operational heroism from control design.

Most importantly, ask one contrarian question at the board level: "If I wanted to steal from us quietly, who would I need to be?" The answer is rarely a stranger.

Cybersecurity does not fail because leaders do not care. It fails because they care selectively. And familiarity is the most dangerous selection bias of all.
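To make the access-review point concrete, here is a minimal Python sketch of the kind of independent privileged-access review described above. The file name, column names, role list, and 90-day window are hypothetical assumptions for illustration, not a reference to any specific tool; the idea is simply to flag privileged accounts that either sit outside the roles expected to hold them or have not been recertified within the review window.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical export columns: user, role, privilege, last_recertified (ISO date)
# Roles assumed, for illustration only, to legitimately need admin rights.
ADMIN_ROLES = {"systems administrator", "security engineer"}
RECERT_WINDOW = timedelta(days=90)  # assumed quarterly review cycle

def review_privileged_access(path: str, today: datetime):
    findings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            privileged = row["privilege"].strip().lower() in {"admin", "superuser"}
            if not privileged:
                continue
            role_ok = row["role"].strip().lower() in ADMIN_ROLES
            last = datetime.fromisoformat(row["last_recertified"])
            stale = today - last > RECERT_WINDOW
            if not role_ok or stale:
                findings.append({
                    "user": row["user"],
                    "issue": "privilege outside role" if not role_ok else "recertification overdue",
                    "last_recertified": row["last_recertified"],
                })
    return findings

if __name__ == "__main__":
    for finding in review_privileged_access("access_export.csv", datetime.now()):
        print(finding)
```

Even a script this small forces the conversation the article calls for: who holds admin rights, and when was that access last challenged?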
The night the numbers stopped making sense
It happened quietly, and that is how most frauds begin in Uganda; not with alarms, but with silence.

On a Tuesday evening in late August, long after the office had emptied and boda bodas thinned on the road, a junior accountant stayed behind at a mid-sized institution on the outskirts of Kampala. The reason sounded noble: month-end reconciliations. The reality was darker. By the time the security guard locked the gate, UGX 1.48 billion had already begun its journey out of the organisation. No gun, no hacking, no drama, just a series of approvals that looked routine, numbers that felt familiar, and people who trusted each other a little too much.

This was not a failure of intelligence. It was a failure of discipline. I know this pattern well. I have walked into too many boardrooms where leaders say, "We trusted our people," as if trust were a control. It is not. Trust is a sentiment. Controls are systems.

The internal conflict that opened the door

Every serious fraud starts with tension, not greed. Suspect 1 was competent, respected, and exhausted. A mid-career professional, sharp suit, soft-spoken, known for "saving the day" during audits. But for two years, his promotion had stalled. New managers arrived. Younger. Louder. Less experienced.

Suspect 2 was different. Charismatic. A fixer. He knew everyone, from mobile money agents to suppliers to the cashier at the bank branch. He thrived in the informal spaces where rules bend.

Their conflict was not personal. It was structural. The organisation had grown, but its controls had not. Roles overlapped. Segregation of duties existed on paper, not in practice. Management prized speed over process. "Just make it work" had become the unofficial strategy. That is where fraud feeds.

How the scheme was engineered

This was not sophisticated. That is what makes it dangerous. The scheme had three moving parts.

First, dormant supplier accounts. Over the years, dozens of suppliers had been onboarded for projects that no longer existed. Their profiles were never deactivated. Bank details sat quietly in the system, untouched but alive.

Second, manual overrides. The finance system allowed senior staff to bypass certain approval thresholds "temporarily." Temporary, in Uganda, often means forever.

Third, mobile money as the bridge. Instead of moving funds directly to personal accounts, which would raise flags, money was routed through supplier accounts, then broken down into smaller mobile money transfers; UGX 20 million here, UGX 15 million there. Familiar amounts. Normal-looking flows.

Suspect 1 processed the entries. Suspect 2 handled the distribution. A third, never formally identified, provided cover by delaying reconciliations. No single transaction looked suspicious. That is the genius of bad systems.

The money trail no one wanted to follow

When Summit Consulting Ltd was called in, the brief was simple: "Just help us confirm the numbers." That sentence always worries me.

We started where most internal teams avoid: timing. Not amounts. Timing. Why were supplier payments peaking on Fridays? Why did mobile money transfers spike between 6:30 pm and 8:00 pm? Why were reconciliations always postponed to "next week"?

We mapped three months of transactions against staff attendance, system logs, and mobile money statements. Patterns emerged quickly. Supplier A received UGX 320 million over six weeks, for services last rendered three years ago.
Supplier B's bank account showed immediate cash withdrawals, followed by mobile money deposits to numbers registered under different names but sharing the same national ID photo. This is Uganda. Paper lies. Patterns don't.

The red flags the auditor finally trusted

The turning point came from an auditor who almost ignored his instinct. He noticed something small: rounding. Payments consistently ended in double zeros. Real operational payments are messy. They include odd figures, fuel adjustments, tax differences, and decimals. Fraud likes clean numbers.

Then came the lifestyle lag. No flashy cars. No mansions. Instead, school fees paid in cash. Loans settled early. Quiet generosity. Fraudsters in Uganda often hide by being modest.

Most importantly, there was fear. Staff avoided certain questions. Files went missing, meetings were postponed, silence thickened. Fear is the loudest red flag.

How internal controls were bypassed

Let us be clear. The controls did not fail. They were never real. Segregation of duties existed, but the same people covered for each other during "busy periods." User access reviews were performed annually, long after the damage was done. The board received dashboards, not discomfort. No one asked the most important question: "Show me how this could be abused."

I say this from experience. Boards prefer assurance. Fraud thrives on reassurance.

The moment the case cracked

The case broke not through technology, but through conversation. A junior staff member, quiet and observant, mentioned casually that Suspect 2 always knew when funds would hit certain accounts, even before system notifications went out. That is insider knowledge. That is coordination. We cross-checked call logs against transaction timestamps. The correlation was near-perfect. At that point, the numbers no longer argued. They confessed.

The cost, counted honestly

The total loss stood at UGX 1,482,600,000. Recoverable? Partially. Some funds had been converted to cash. Some invested informally. Some gone. But the real loss was trust between staff, management, and the board. And trust, once broken, costs more to rebuild than any balance sheet can show.

What leaders must confront

Fraud is rarely about bad people. It is about lazy systems, unclear accountability, and leaders who confuse activity with control. If your best defence is "we trust our people," you are already exposed. If reconciliations can wait, so can fraud detection. If no one is uncomfortable in your boardroom, someone is comfortable stealing.

I have carried coffins in Munteme village. I have watched savings groups collapse because no one wanted to ask hard questions. The lesson is the same at every level: darkness is not evil. It is merely unattended. Fraud does not announce itself. It waits for permission. And permission is often silent.
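The red flags in this case, round figures, evening mobile money spikes, and dormant supplier accounts, lend themselves to a very simple screening pass before any deep forensics begins. Below is a minimal Python sketch of that idea. The file name, column names, rounding granularity, evening window, and two-year dormancy cut-off are illustrative assumptions, not the actual procedure Summit Consulting used.

```python
import csv
from datetime import datetime, time

# Hypothetical export columns: date_time (ISO), payee, channel, amount_ugx, supplier_last_active (ISO)
EVENING_WINDOW = (time(18, 30), time(20, 0))   # the 6:30-8:00 pm spike noted in the case
DORMANCY_YEARS = 2                             # assumed cut-off for "dormant" suppliers

def screen_payments(path: str):
    flags = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            when = datetime.fromisoformat(row["date_time"])
            amount = int(row["amount_ugx"])
            reasons = []
            # Red flag 1: suspiciously clean, round figures (granularity is an assumption)
            if amount % 100_000 == 0:
                reasons.append("round amount")
            # Red flag 2: mobile money moving in the evening window
            if row["channel"] == "mobile_money" and EVENING_WINDOW[0] <= when.time() <= EVENING_WINDOW[1]:
                reasons.append("evening mobile money transfer")
            # Red flag 3: payment to a supplier with no recent activity
            last_active = datetime.fromisoformat(row["supplier_last_active"])
            if (when - last_active).days > 365 * DORMANCY_YEARS:
                reasons.append("dormant supplier")
            if reasons:
                flags.append((row["payee"], amount, when.isoformat(), reasons))
    return flags

if __name__ == "__main__":
    for payee, amount, when, reasons in screen_payments("payments_export.csv"):
        print(payee, amount, when, "; ".join(reasons))
```

None of these flags proves fraud on its own. The value is in stacking them, exactly as the investigators did when they mapped transactions against attendance, system logs, and mobile money statements.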
Season’s greetings from all of us at Summit Consulting Ltd, Institute of Forensics and Twezimbe.com
As the year draws to a close, we pause with gratitude. Thank you for the trust you placed in us, the conversations that mattered, and the courage to confront hard decisions together. In a world that rewards noise, you chose clarity. In moments that demanded comfort, you chose discipline. That is leadership, and it has been a privilege to walk alongside you.

Christmas reminds us that progress is built quietly, through values, relationships, and long-term thinking. The New Year calls us to sharpen our judgment, strengthen execution, and lead with purpose in a more demanding world.

From all of us at Summit Consulting, we wish you a peaceful Christmas and a bold, successful New Year filled with sound decisions, resilient systems, and enduring impact.

Please note that our office will close on 22nd December 2025 and reopen on 5th January 2026. The office will remain closed during this period, with key roles working from home. We look forward to continuing the journey with you in the year ahead.

Warm regards,
All of us at Summit Consulting

2025 in review at Summit Consulting Ltd

2025 was a defining year for Summit Consulting. A year of building, loss, resilience, and quiet progress that will shape the next decade of our work.

We reached a major milestone with Twezimbe.com. After two years of disciplined effort, the platform is now live. You can now register and automate your benevolent fund or run a crowdfunding campaign for your Church or Mosque project. Have a club? You can automate it and easily collect membership fees. What started as a conviction is now a working system, built to formal standards, designed for scale, and grounded in real community and institutional needs. It is a reminder that serious products are forged through patience, not publicity.

We also took governance to a higher level with Mela GRC. The platform replaced our old tool, actionTEAM GRC, and moved governance, risk, and execution from fragmented tools to one integrated operating system. Boards and executives can now see, decide, and act with clarity. This was not an upgrade. It was a reset of how governance should work in practice.

Through IFIS, we strengthened the professional ecosystem. The 3rd Annual IFIS Conference was successfully delivered, convening practitioners, leaders, and regulators around real-world challenges in cybersecurity, investigations, and risk. The conversations were honest, practical, and forward-looking.

2025 was also marked by deep loss. We lost our Chief Operations Officer, Godfrey Ssenyonjo. A colleague, a professional, and a steady presence in our journey. His passing left a silence that cannot be filled. We continue to pray for his soul and for the young family he left behind. His contribution, character, and commitment remain part of Summit's story.

We end the year grateful, grounded, and clear-eyed. Stronger in systems. Deeper in purpose. More human in perspective. That is 2025 at Summit Consulting.

Copyright Summit Consulting Ltd 2025. All rights reserved.
Turn screen time into skill time: Enroll your child in cybersecurity this holiday
I receive many cases, and some stand out. During a school holiday two years ago, a parent complained that their child was "always on the phone." Games. YouTube. Endless scrolling. The usual frustration. The solution they tried was the usual one: confiscate the phone, restrict Wi-Fi, and threaten punishment. It worked for a day. Then the cycle returned.

Instead, we tried something different. We asked: what if screen time is not the problem, but the waste of it is? The child was given a basic cybersecurity challenge. Nothing dramatic. Just simple tasks. How passwords are guessed. How fake links work. Why accounts get hacked. How to spot a scam message. The phone was no longer just for consumption. It became a tool.

Within days, behaviour changed. The child started explaining phishing to siblings. They began questioning suspicious links. They stopped oversharing online, not because they were warned, but because they understood risk. Screen time did not reduce. It became purposeful.

This is the reality many parents miss. Children are already digital. The danger is not the screen. The danger is ignorance.

In another case, a teenager helped their parent avoid a mobile money scam. The message looked genuine. Same tone. Same urgency. But the child spotted the red flags immediately. Wrong link. Poor domain. Pressure language. Money saved. Lesson learned.

Cybersecurity is not an IT subject. It is a life skill. At the Institute of Forensics & ICT Security (IFIS), we take the same view we take of road safety: we teach children to cross the road not because we expect accidents, but because risk exists. The digital world is no different. Children are online earlier than ever, interacting with strangers, sharing data, clicking links, and building digital footprints they will live with for years.

This holiday, the question is not whether your child will be on a screen. They will be. The question is whether they will leave the holiday with nothing but high scores and short videos, or with a skill that builds confidence and judgment.

Cybersecurity training does something powerful for children. It trains them to think before they click. To question before they trust. To understand systems, not just use them. It quietly builds discipline, curiosity, and responsibility. When school resumes, the difference shows. These children are not just users of technology. They are safer, sharper, and more aware. They do not panic online. They pause.

This is how you future-proof a child. Not by banning technology, but by teaching mastery over it. Do not stop children from crossing the road. Teach them how to cross the road safely. Turn screen time into skill time. Enroll your child in cybersecurity training this holiday season. Because the digital world is not waiting.

The breach you never notice until it owns you

Let me describe the most dangerous breach I have ever seen. No alarms went off. No systems went down. No ransom note appeared on the screen. Business continued as usual. That was the problem.

In this case, a staff member clicked a harmless-looking link. Nothing happened. Or so it seemed. No files disappeared. No money moved. Everyone relaxed. What they did not know was that access had already been granted.

For weeks, the attacker watched quietly. Emails. Approvals. Password resets. Internal conversations. Who reports to whom. How money is approved. Which controls are ignored when people are in a hurry.

By the time the fraud happened, the breach was old news. The attacker did not break in loudly. They moved in politely. This is how most real breaches work.
Not dramatic. Just silent.

In another case, a company kept blaming staff for "carelessness." But the truth was that the breach was not caused by one click. Habits caused it. Shared passwords. No monitoring. Outdated systems. Trust without verification. The breach was not technical. It was cultural.

The most dangerous breaches are the ones you normalise. Small policy exceptions. One person with too much access. Alerts that are always ignored. Logs no one reads. Over time, attackers learn your organisation better than new employees do. And when they finally act, it looks like an inside job, because from the system's point of view, it is.

By the time money moves, reputations fall, or regulators arrive, the breach already owns you. It knows your weaknesses. It knows your delays. It knows your fear of disruption.

This is why prevention is not enough. You must assume compromise and design for detection. Early, constant, and relentless. Cybersecurity is not about stopping every attack. It is about spotting the quiet ones before they grow teeth. If you only react when systems fail, you are already late.

The real question leaders should ask is simple. If someone got in today, how soon would we know? If the answer is "after damage," then the breach is already ahead of you. The breach you never notice is the one that owns you.
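If the honest answer to "how soon would we know?" is "after the damage", one inexpensive starting point is to look for accounts suddenly acting from addresses never seen before. The Python sketch below is illustrative only, assuming a hypothetical authentication log export with made-up column names; it is not a substitute for proper monitoring, but it shows how little is needed to start reading the logs no one reads.

```python
import csv
from collections import defaultdict

# Hypothetical log columns: timestamp (ISO), user, source_ip, action
def first_seen_report(log_path: str):
    seen = defaultdict(set)   # user -> set of source IPs already observed
    alerts = []
    with open(log_path, newline="") as f:
        for row in sorted(csv.DictReader(f), key=lambda r: r["timestamp"]):
            user, ip = row["user"], row["source_ip"]
            if seen[user] and ip not in seen[user]:
                # The account has history, but this address has never been used before.
                alerts.append((row["timestamp"], user, ip, row["action"]))
            seen[user].add(ip)
    return alerts

if __name__ == "__main__":
    for ts, user, ip, action in first_seen_report("auth_log.csv"):
        print(f"{ts}  {user}  new source {ip}  during '{action}'")
```

A real environment would feed this into proper monitoring and enrich it with device identifiers and working hours, but the principle is the one the article insists on: assume compromise, then measure how quickly you would notice.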
Fight fraud. Build confidence. Get certified.
Some moments are unforgettable. I was sitting in a boardroom late in the evening, and the investigation was done. Everyone in that room knew fraud had happened, because the numbers told a clear story and the behaviour was obvious. Yet when the report was projected on the screen, the room went quiet for the wrong reason.

The word "draft" was all over the place. Pages were inconsistent. Conclusions appeared before evidence. Screenshots were pasted without explanation. Dates did not align. There was no clear trail from fact to finding. The suspect's lawyer did not need to argue much; the report argued against itself.

And just like that, the case collapsed at the disciplinary hearing. Not because the suspect was innocent, but because the investigator could not clearly connect the evidence, document it properly, or defend it with confidence. The suspect walked out. The organisation was left exposed, and good people paid the price.

That day taught me something uncomfortable. Fraud does not survive because it is smart; it survives because investigations are often poorly done and poorly presented.

Fraud is not just about money. It destroys trust. Staff lose faith in leadership, boards start doubting controls, and institutions become afraid to act. In Uganda, I have seen organisations live with known fraudsters simply because leadership feared losing in court or before regulators.

Most professionals sense fraud long before they can prove it. A process keeps being bypassed. Numbers are always "rounded." One person controls too much. A lifestyle does not match income. But instinct is not evidence. And a weak report can be more dangerous than no report at all.

Confidence does not come from authority. It comes from competence. From knowing how to structure an investigation, how to write a report that flows logically, how to move from evidence to conclusion without gaps, and how to face a board or a judge and calmly say, "This is what happened. This is how we know."

Certification changes how you show up. I have seen professionals transform. They stop guessing, stop over-explaining, and stop hiding behind jargon. Their reports become clear, their thinking becomes sharp, and their presence becomes steady. When things go wrong, organisations do not look for drama; they look for evidence.

So this is not about adding letters after your name. It is about becoming dangerous to fraud and safe for the truth, and about protecting your organisation, your career, and your reputation.

Fight fraud with skill. Build confidence through mastery. Register today for the January 2026 intake: https://forensicsinstitute.org/ifis-events-registration/

Get certified. Because in the real world, truth only survives when it is well proven.
Transform your career with great investigation tools
Are you a victim of fraud? Are you working too hard and cannot see where the money goes? Are you an auditor or fraud investigator who would like to know how to collect water-tight evidence so that you become a darling of the prosecution and/or your company team? Find the solution with us today.

Whether you are the subject of an investigation or a staff member at a company where fraud has taken place, you deserve peace of mind. The Institute of Forensics & ICT Security (IFIS) helps you get all the facts so that you know the truth. In business, acting on opinion, hearsay, and audit reports alone is not advisable.

As forensic investigators, we leave no stone unturned. Every transaction matters. Do you want to know the truth? Our forensics experts will help you know who did what, where, when, and how. We investigate IT, cyber, and general staff fraud. We will help you save on legal costs, protect your reputation, and fast-track your case. Never again will you beg staff to resign, even when you know they stole from you! Are some of your ex-employees threatening your reputation? Worry no more. We help dig up all the facts.

We conduct fraud and forensic investigations in line with the laws of Uganda. Our investigators are available to support your case as expert witnesses. We handle over 20 fraud investigation cases in Uganda, Kenya, and South Africa annually.

The INVESTIGATE toolkit comes with:
A customized report template to guide you in writing a good, court-admissible report
Sample evidence collection templates so that you collect evidence that complies with the Evidence Act, laws of Uganda, or another country
A fraud investigation methodology so that you know the step-by-step process of fraud investigations
A suspect interview script template, so that you interview suspects and witnesses like a pro and get court-admissible evidence
A statement-taking template/form so that you can build a strong case and hold suspects or witnesses accountable as the investigation progresses
And above all, tools to help you become an expert investigator, including the CFE fraud examination manual notes

Want peace of mind? Contact us today!

To read more blogs, click here
To download the 2026 IFIS Training Calendar, click here
To be motivated, click here.
Digital frauds are killing businesses silently
We live in a digital world where the use of computers, the Internet, and other sophisticated technology is at its peak. Today, people pay bills, send and receive payments, and review bank transactions and balances over the Internet and on mobile phones. Such innovations are intended to make life easier and more comfortable. Yet in less than five years, crime and fraud have changed from an anomaly of teenage vandals into a multi-billion-dollar industry. Every year, millions of people fall victim to such activities. There are many stories of cybercrime; the hacking of some of America's confidential information, for example, almost ruined diplomacy between many countries.

Business experts argue that although online platforms are cheap and convenient for business transactions, and most businesses have graduated from traditional means of operation, the risks involved are immense.

According to Mr. Mustapha Mugisa, a certified fraud examiner and former president of the Uganda chapter of the Association of Certified Fraud Examiners, digital fraud is one of the biggest emerging risks in Uganda as people embrace technology and its advancement. Uganda is currently at the technology take-off stage, where every business is embracing technology and using the Internet to reach the international market. In telecoms, insurance, and banking, for example, we have seen the emergence of mobile money transfer and mobile commerce (mCommerce), whereby people use their mobile phones to pay bills like water, electricity, and DSTV, or even buy goods at supermarkets. This makes the country ripe for high risks of digital fraud unless something is done now to protect users.

Such automation between many companies increases the risk of fraud because the more interfaces there are, the higher the risks involved. Recently, almost all telecom companies operating in Uganda and offering mobile money services have entered into partnerships with other service providers to make money transactions easier and more convenient, but this means the scale of digital fraud increases to a greater magnitude. Unlike today, in the past it was difficult to steal a lot of information at once because manual documents are bulky.

In most cases, fraud is directly or indirectly aided by internal staff, owing to a lack of training and sensitisation on what digital fraud means and how to protect company information from social engineers. The use of technology is advancing daily, and there is no way one can avoid the wave of change. Companies should, however, train their staff and implement clear security policies, clearly explaining the forms of digital fraud, staff responsibilities in averting the same, and the repercussions for failure to comply, among others. Companies should also aim to have a security-aware culture throughout the business, as well as protecting their major digital resources from hackers and social engineers. Implementing secure firewalls to protect computers connected to the Internet should be a must across the board.

It is high time companies invested in digital security defence measures and ensured that these are integrated and adequately up to date to respond to the ever-changing and increasingly sophisticated technologies being utilised in such crime. "Some people are connected to the Internet but have not implemented security measures to block hackers," he advises.
With the introduction of wireless networks, many people now prefer to work from the nearby cyber cafe because the connection is free, not knowing that they are at greater risk of being hacked and revealing confidential company information, such as business strategy, because the person controlling that network can easily access their profile.

According to a study by PricewaterhouseCoopers' Forensic Technology Solutions team, companies should use risk assessment strategies to ensure their investment is targeted towards those security controls that offer the greatest business benefit, as well as having a clear understanding of their legal rights and responsibilities in relation to digital fraud, and aim to implement internationally recognised standards of best practice.

Fraud in the digital world is not likely to end any time soon, because the more sophisticated the technology becomes, the higher the risks involved. It is high time the government, through regulatory bodies like the Bank of Uganda, stepped in to provide assurances of the security of emerging technologies like mobile money services, thereby reducing identity theft, fraud, money laundering, and embezzlement from companies.

Take the lead in the fight against digital fraud

Digital fraud will continue to evolve. The only question is: will you evolve with it? Join the January 2026 Certified Fraud Examiner class and position yourself as the fraud expert Uganda needs in this critical digital era. Secure your discounted spot today.

Become a Certified Fraud Examiner. Protect the future. Enroll now. Limited seats available.

Download our 2026 Training Calendar: https://forensicsinstitute.org/download/ifis-training-calendar-2026/
Why every child needs cybersecurity skills this holiday
Let me begin with a simple truth that many parents in Uganda still underestimate. Your child is already living in a world you barely understand. Their friendships are digital. Their assignments are digital. Their identity is digital. Their future wealth will be digital. And their biggest risks? Also digital. Pretending cybersecurity is an "adult issue" is the fastest way to raise a vulnerable child in a dangerous world.

This holiday, while everyone else is preparing to teach children new dances, cooking lessons, or driving skills, I want us to think differently. Your child does not need more entertainment. They need digital self-defence. And the earlier they learn it, the safer their future.

The moment a harmless game became an investigation

Two weeks ago, during a cybersecurity training session, a mother approached me, worried about her son. Let us call him Peter, Subject A in this story: a slender boy, always wearing bright T-shirts and moving with the restless attention of someone raised on screens. He had clicked on a "free upgrade" link inside a mobile game. Within hours, the phone he had borrowed started sending strange notifications. Data was consumed at an alarming rate. Contacts synced to unknown servers. That is how easily a child can compromise an entire family.

This is not a theoretical risk. Children are now the softest entry point for cybercriminals into homes, offices, and even high-security environments. They trust too quickly. They click too fast. They share too openly. And they rarely tell their parents until it is too late. That is why cybersecurity is no longer an optional talent. It is a survival skill.

Why your child is the new target

Children are now the easiest way for criminals to enter your household. And they do it quietly. Children trust screens more than people they know. They do not evaluate risk; they follow excitement. Their digital footprints outlive childhood. Criminals design attacks specifically for young users' behaviour.

Every online profile your child creates today builds a lifelong digital identity. Mistakes made at age 10 can haunt them at 25 in job applications. If you are with your child, ask them to Google their own name and see how much of their identity is already public. The shock alone begins the learning.

A case of a simple school assignment gone wrong

Subject B, a tall girl of about 11, usually wearing her school uniform even during the holidays, downloaded a "free PDF converter" for her homework. Hidden inside was spyware. It captured keystrokes, screen activity, and camera access. Her father did not believe it until we replayed a log showing screenshots of their living room taken without permission.

Let us think differently. Children are not careless. They are untrained. Free tools often carry malicious add-ons. Children rarely read permission requests. Household devices become compromised silently. Attackers exploit educational habits: downloading, searching, sharing.

Try it now. Ask your child to explain what each app on their phone does, what permissions it uses, and why. The gaps will reveal their vulnerabilities.

The digital world your child walks into blindly

We must stop pretending children are safe because they are "just on TikTok" or "only chatting with classmates." Digital danger is not dramatic. It is quiet, gradual, and psychological. Social engineering now targets children first. Identity theft begins with small personal details. Cyberbullying leaves long-term behavioural scars. Strangers disguise themselves as peers effortlessly. What else to know?
A child's first online trauma shapes their confidence for life. I got the shock of my life when I did a simple online test at home. You too can try it. Run a mock phishing test at home. Send your child a harmless but deceptive link. Discuss why they clicked or didn't click. This is the most practical lesson you can give.

Why the holiday is the perfect time to train them

Children learn best when routines are broken. Holidays give you uninterrupted time to build digital discipline without school pressure. Holidays expose children to longer screen time. Boredom pushes them into riskier digital spaces. Peer challenges increase unsafe behaviours. Supervision reduces; curiosity increases.

Let us think differently. Cybersecurity is not about fear. It is about empowerment. Spend one hour daily teaching them simple tasks: checking browser history, identifying fake URLs, and turning on two-factor authentication. Or you can enrol them in the Institute of Forensics & ICT Security holiday program. This small routine builds lifelong digital instincts.

The skills every child must have by the New Year

If you want your child to thrive in a future driven by AI, automation, and digital identity, these baseline skills are non-negotiable. How to identify suspicious links and apps. How to create strong, memorable passwords. How to configure privacy settings on every platform they use. How to recognise manipulation: emotional, social, and digital.

What else to know? These are the same skills required of cybersecurity professionals. Your child is building a career foundation without even realising it. Activity: ask your child to "teach you" how to protect your phone. When they teach, they learn twice.

We must face a hard truth. The school system is not built for the digital age. The curriculum is slow; the risks are fast. Teachers are overwhelmed; children are overexposed. Keep in mind that most schools lack cybersecurity programs. Teachers often do not understand modern threats. ICT lessons focus on typing, not self-defence. Parents wrongly assume schools provide safety. Parental leadership is the new frontline of cyber protection. I encourage parents to create a weekly "digital briefing" with their child. Review their digital week the same way CEOs review operations.

A deeper look at Subject A and Subject B

Remember our two cases? Subject A's curiosity led to a malware infiltration that compromised household contacts. Subject B's homework tool gave a stranger silent access to her home. The lesson? Cybersecurity failure rarely looks dangerous at first. It looks innocent. It looks harmless. It looks like "something small." Children do not understand digital consequences. Small digital mistakes have a real-world impact and leave lifelong pain. Attackers
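For the "identify suspicious links" skill mentioned above, a parent and child can even walk through a toy checker together. The Python sketch below is purely illustrative and intentionally simplistic: the red-flag list, thresholds, and example links are assumptions for teaching purposes, not a real detection engine, and it will miss plenty of genuinely malicious links.

```python
from urllib.parse import urlparse

# A deliberately simple teaching aid: a few red flags a child can learn to spot.
# The flag list and thresholds are illustrative assumptions, not a security product.
def link_red_flags(url: str) -> list[str]:
    flags = []
    parsed = urlparse(url if "://" in url else "http://" + url)
    host = parsed.hostname or ""
    if parsed.scheme != "https":
        flags.append("not using https")
    if host.replace(".", "").isdigit():
        flags.append("raw IP address instead of a name")
    if host.count("-") >= 2 or host.count(".") >= 4:
        flags.append("unusually long or hyphenated domain")
    if any(word in url.lower() for word in ("free", "win", "urgent", "verify", "gift")):
        flags.append("pressure or bait wording in the link")
    return flags

if __name__ == "__main__":
    # Example links are hypothetical, for discussion only.
    for link in ["https://forensicsinstitute.org", "http://mtn-momo-verify-account.win-gift.xyz/free"]:
        print(link, "->", link_red_flags(link) or "no obvious red flags")
```

The point is not the code. It is the conversation it forces: what would make you pause before clicking this?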
The technique of persuasive report writing
If there is one truth I have learned from decades of digital forensics and fraud investigations, it is that you never win an investigation in the field. You win it in the report. Courtrooms do not see your sleepless nights, your reconstructed log files, your digital breadcrumbs, or the clever trap you set at 3:07 a.m. They see your report. A weak report frees the guilty. A persuasive, fact-driven one convicts without drama.

Let me share an experience to drive the point home. I will use generic descriptions to keep it professional, but I want you to feel the tension as if you were seated next to me in the digital lab.

It began with a procurement refund. Nothing dramatic. A routine payment flagged by a junior auditor who had sharp eyes and a habit of taking notes on sticky papers. The staff member responsible, Subject 1, a short man in his late 30s with thick eyebrows and a soft-spoken demeanour, claimed it was a simple mistake.

Mistakes rarely leave trails. But this one did. His device logged into the system at 11:48 p.m., long after the office had closed. The login came through a residential IP outside Kampala. The VPN he used had the unmistakable signature of a free-tier application. And the approval timestamp on the refund matched the exact minute his device connected. This was no mistake. This was a story waiting to be written, structured, and presented in a way that convinces without accusing.

Why investigators lose cases

Many investigators collect evidence like people collect souvenirs. Random, scattered, emotional. But persuasive report writing requires discipline. A good report does three things: it tells the truth plainly, it links events logically, and it leads the reader to a conclusion without dragging them there.

During log reconstruction, another unusual pattern emerged. Someone with customer support credentials had system access far above their role. That someone was Subject 2, a tall woman in her early 30s, always wearing oversized headphones and walking with a confident stride that made her presence felt before she spoke. Her role did not require approval rights, yet she had them. Quietly added. Quietly used.

In a poorly written report, this would appear as a vague statement: "Subject 2 had unnecessary access." That is not persuasive. A good report states: "Subject 2 held administrator privileges for Module X from 14 March to 22 June. System records show these privileges were not authorised by IT, HR, or management." Facts speak. Your job is to present them without dramatising.

Reconstructing the fraud pattern

When we mapped six months of server interactions, a sequence emerged. Subject 1 initiated small refund transactions outside working hours. Subject 2 approved them within minutes. The funds consistently landed in two mobile money accounts linked to a dormant SIM card recently reactivated. Withdrawals occurred within 15 minutes of deposit, at locations far from the main office.

Notice the structure. Actions. Timing. Correlation. Behaviour. A persuasive report writes like a silent camera, not like a prosecutor.

The power of digital breadcrumbs

In digital forensics, nothing truly disappears. A careless tap on a phone. A forgotten location setting. A reused password. These are not mistakes; they are signatures. In this case, Subject 1 forgot to disable location sharing. One of the approvals placed him near a small food joint in Najjera. The GPS placed him there at the exact minute an unauthorised login occurred. Your report does not need to accuse him.
It only needs to say: "Geolocation metadata from Device A (IMEI xxxx) places Subject 1 at Coordinates (xx, yy) at 21:42. This timestamp coincides with the approval action recorded under Username Z." There is no need for adjectives. Evidence is its own witness.

The technique: write to convict, not to entertain

Courts and disciplinary committees operate on clarity, not emotion. A persuasive report uses four principles:
Sequence – events arranged chronologically
Correlation – linking logs, actions, and outcomes
Neutral language – no adjectives, no passion
Evidence anchoring – every assertion tied to a source

When you master these, you no longer write a report. You build a conviction pathway. Let us return to the case.

Building the narrative that stands in court

Your report should walk the reader through the fraud as if they are discovering it themselves. For example: "On 11 April at 23:48, an unauthorised login occurred via IP xxx. The credentials belonged to Subject 1. Two minutes later, an approval for Refund Code 0098 was entered through Subject 2's credentials. Both actions originated from devices not registered under the organisation's asset list." There is no accusation. There is only fact, timing, and source. That is what persuades.

Most organisations write investigative reports like internal memos: short, vague, loaded with assumptions, and designed to please management. That is why most cases collapse. The winning investigator does not write to please; they write to prove. The secret is to structure the report so that the conclusion is irresistible. You do not need bold statements such as "Subject 1 committed fraud." Instead, your evidence leads the reader to that conclusion gently, firmly, and logically. If your narrative is clear enough, the reader will convince themselves.

To understand persuasive reporting, try this exercise with your team. Take a simple event, such as a lost office key. Collect every observable fact without opinion. Reconstruct the timeline from these facts. Write a one-page report stating only what is provable. Read it aloud. If no one can dispute your sequence, you have written persuasively. This exercise exposes how much noise investigators add unnecessarily.

What else should you know

A persuasive forensic report has five components:
Introduction: what triggered the investigation
Methodology: how the evidence was collected
Findings: facts arranged logically
Analysis: connecting patterns and evidence
Conclusion: a measured statement based strictly on facts

The conclusion must never accuse. It must recommend. Accusation belongs to disciplinary or legal authorities. Investigators present truth, not judgment. A good conclusion reads like this: "Based on the evidence presented, the activities of Subjects 1
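The sequence and correlation principles above can also be operationalised before the narrative is ever written. Here is a minimal Python sketch that pairs each approval with any login made under different credentials from the same device within a short window, the pattern described in this case. The file names, column names, the choice of device identifier as the linking key, and the two-minute window are illustrative assumptions, not the method actually used in the investigation.

```python
import csv
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=2)   # the "two minutes later" pattern observed in the case

def load(path):
    # Expected columns (hypothetical): timestamp (ISO), user, device_id
    with open(path, newline="") as f:
        return [
            {**row, "ts": datetime.fromisoformat(row["timestamp"])}
            for row in csv.DictReader(f)
        ]

def correlate(logins_path: str, approvals_path: str):
    logins = load(logins_path)
    pairs = []
    for approval in load(approvals_path):
        for login in logins:
            close_in_time = timedelta(0) <= approval["ts"] - login["ts"] <= WINDOW
            same_device = approval["device_id"] == login["device_id"]
            different_user = approval["user"] != login["user"]
            if close_in_time and same_device and different_user:
                pairs.append((login, approval))
    return pairs

if __name__ == "__main__":
    for login, approval in correlate("logins.csv", "approvals.csv"):
        print(f'{login["ts"]} login by {login["user"]} -> '
              f'{approval["ts"]} approval by {approval["user"]} on device {approval["device_id"]}')
```

In the actual report, only the provable pairings survive, written in exactly the neutral, time-anchored language shown above.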
The forensic hunt, tracing fraud through digital breadcrumbs
I have spent more than two decades in digital forensics, and one truth keeps confronting me: fraudsters do not vanish into thin air. They leave trails; subtle, scattered, and often arrogant trails. The tragedy is that organisations rarely look in the right place, at the right time, with the right discipline. That is why so many cases in Uganda, from banks to insurance firms to government agencies, quietly die in boardrooms, not due to lack of evidence, but due to lack of proper forensic pursuit.

Let us walk through a real scenario. I will use generic descriptions to keep it professional. Visualise it as if I am taking you into a live investigation room.

The unusual transfer

It began with a single suspicious payment. An internal auditor noticed a strange mobile money transaction tagged to a procurement refund. The staff member responsible, let us call him Subject 1, a soft-spoken man in his early 30s, always in oversized shirts that made him look smaller than he was, insisted it was an error.

Errors do not repeat. And in this case, it was not the transaction, but the digital trail around it, that raised my eyebrows. The device he used was logged into the system after office hours. The IP address did not match the organisation's network. A VPN with free-tier characteristics was hiding the location. And the timing aligned perfectly with the moment the payment left the organisation's account.

When you investigate long enough, you learn not to chase drama. You chase patterns. Drama is for television. In forensics, patterns tell the truth.

Where breadcrumbs start to speak

Fraud today is rarely analogue. Even cash-based fraud begins with a digital footprint. A password typed too fast. A login attempt from a phone the user "forgot." A deleted message that was backed up somewhere else. Fraudsters underestimate the permanence of their own behaviour.

While following Subject 1's activities, I noticed a second actor: Subject 2, a tall woman with sharp features and a habit of wearing large headsets even when not listening to anything. She worked in customer support. And yet, somehow, her workstation had administrator-level access for a module she never used. That is not a red flag. That is a red billboard.

When organisations allow privilege creep, where employees quietly accumulate system access they should not have, you no longer require sophisticated hackers. You create them internally.

Following the trail deeper

We isolated three key breadcrumbs:
The IP address mismatch
The out-of-role system access
Mobile money deposits repeatedly landing on a number linked to an unregistered SIM card

When you see three digital anomalies within the same window of time, the odds of coincidence become statistically insulting. But instead of jumping to conclusions, a mature forensic investigator builds a hypothesis, tests it, and tries to break it. I always tell boards: the worst investigators are the ones who rush to "the culprit." The best ones rush to the evidence.

We pulled the server logs, network metadata, and mobile phone records. Then we mapped every transaction over six months. This is where many organisations panic: the fear of what they might find. Truth is expensive. But ignorance is catastrophic.

How the scheme worked

Subject 1 initiated payments disguised as supplier refunds. Subject 2 escalated system privileges to approve them. The digital movement was subtle: small amounts, spread across different dates, routed through two mobile money accounts and a dormant bank account belonging to a distant acquaintance.
Here is the interesting part. The amounts were too small to trigger internal alarms but large enough to accumulate significantly over time. That is the new face of fraud in Uganda: slow, patient theft. It thrives in organisations where leadership only reacts to large explosions and ignores small smoke.

Most organisations believe fraud is caught by "strong controls." I disagree. Controls only detect predictable fraud. It is behavioural analysis, cross-matching logs, and understanding human patterns that expose the real schemes. Technology does not commit fraud; human motive does. And motives echo loudly through digital behaviour.

Once we analysed login times, data entry patterns, device identifiers, and mobile money flows, the scheme became embarrassingly clear. You could predict the next attempted transfer before it happened. In one of the logs, Subject 1 had forgotten to disable location sharing on his phone. That single oversight placed him at a small local restaurant at the exact moment the irregular approval was logged. Digital breadcrumbs do not lie. Human beings do.

What the board must understand

Digital forensics is not about recovering deleted files. It is about reconstructing truth. In this case, we mapped the fraud from origin to execution:
Access escalation
Transaction manipulation
Digital concealment attempts
Proceeds routing
Withdrawal behaviour

When you show this trail to leaders, they often ask, "How did we miss this?" The honest answer is simple. It is because no one was looking for it. Ugandan firms still treat cybersecurity and digital forensics as IT support. Yet most fraud is authorised using legitimate credentials and insider access. This requires governance, not gadgets.

What else you should know

Digital forensics is not magic. It is meticulous discipline. If your organisation:
does not centralise logs
does not restrict privilege escalation
does not segment networks
does not review after-hours activity
…you are not running a secure enterprise. You are running a house without doors, hoping no one walks in.

When I train teams, I run a simple exercise. I ask them to check their personal email accounts and view their login history. Almost every time, someone discovers a login from a device or location they do not recognise. The real shock comes when they see how long that device has had access. The same shock awaits many organisations.

Fraud will evolve. Human beings will not. Greed will remain constant. But so will digital footprints. The only question is whether your organisation is disciplined enough to follow them. Digital breadcrumbs do not disappear. They simply wait for the right investigator. And in a world where every action, login, tap,
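One of the gaps listed above, after-hours activity that no one reviews, is also the easiest to start closing. Below is a minimal Python sketch of that review, assuming a hypothetical system log export; the office-hours window and column names are assumptions for illustration, not a prescription for any particular system.

```python
import csv
from datetime import datetime, time

OFFICE_HOURS = (time(8, 0), time(18, 0))   # assumed working window; adjust to your reality

# Hypothetical export columns: timestamp (ISO), user, action, source_ip
def after_hours_activity(log_path: str):
    findings = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            when = datetime.fromisoformat(row["timestamp"])
            outside = not (OFFICE_HOURS[0] <= when.time() <= OFFICE_HOURS[1])
            weekend = when.weekday() >= 5
            if outside or weekend:
                findings.append((row["timestamp"], row["user"], row["action"], row["source_ip"]))
    return findings

if __name__ == "__main__":
    for ts, user, action, ip in after_hours_activity("system_log.csv"):
        print(f"{ts}  {user}  {action}  from {ip}")
```

An 11:48 p.m. approval from a residential IP, the very signal in this case, would appear on such a report the next morning, provided someone actually reads it.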