Risk management is not just for enterprises with dedicated security teams and six-figure compliance budgets. I built this risk register as a practical GRC exercise using a small fictional non-profit as my subject. Here's what I did, why I made the decisions I made, and what the process actually taught me.
The organization handles member and donor data, manages a physical facility, runs on a volunteer workforce, and has no dedicated IT or security staff. That context matters. Risk management does not exist in a vacuum. The size, maturity, and resources of an organization shape every decision you make about likelihood, impact, and what realistic remediation looks like.
How I Structured It
Before identifying a single risk, I locked in the structure. A risk register is only useful if it is consistent. Every entry needs to capture the same information so you can compare, prioritize, and track over time.
Here is what I included for every risk:
- Risk ID — a unique identifier for tracking and cross-referencing
- Risk Description — a plain-language statement of what could go wrong
- Category — the domain the risk falls under: Cybersecurity, Operational, Compliance, or Financial
- Likelihood — how probable the risk is: Low, Medium, or High
- Impact — how damaging it would be if it occurred: Low, Medium, or High
- Risk Score — a numeric value from the Likelihood x Impact matrix
- Risk Level — a qualitative label for stakeholder readability
- Controls in Place — what is already happening, even informally
- Recommended Actions — specific, actionable steps to reduce the risk
- Risk Owner — the role accountable for managing this risk
- Status — where remediation stands: Open, In Progress, or Mitigated
One column I want to highlight: Controls in Place. A lot of risk assessments skip straight to recommendations without honestly documenting the current state. I think that is a mistake. Documenting existing controls, even weak or informal ones, gives you a real baseline. It tells you where you are starting from, which is what makes improvement measurable and recommendations credible.
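To make the structure concrete, here is a minimal sketch of one register entry as a Python dataclass. The fields mirror the columns above; the example values paraphrase R-001 from the walkthrough below, and the owner name is purely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    risk_id: str             # unique identifier, e.g. "R-001"
    description: str         # plain-language statement of what could go wrong
    category: str            # Cybersecurity, Operational, Compliance, or Financial
    likelihood: int          # 1 = Low, 2 = Medium, 3 = High
    impact: int              # 1 = Low, 2 = Medium, 3 = High
    controls_in_place: str   # what is already happening, even informally
    recommended_actions: str # specific steps to reduce the risk
    risk_owner: str          # role accountable for managing this risk
    status: str              # Open, In Progress, or Mitigated

    @property
    def score(self) -> int:
        # Risk Score is the product of Likelihood and Impact
        return self.likelihood * self.impact

# Illustrative entry loosely based on R-001; the owner role is made up
r001 = RiskEntry(
    risk_id="R-001",
    description="Unauthorized access to donor or member data",
    category="Cybersecurity",
    likelihood=3,
    impact=3,
    controls_in_place="Password-only authentication",
    recommended_actions="Implement MFA; enforce stronger password policies",
    risk_owner="IT Volunteer Lead",  # hypothetical role for illustration
    status="Open",
)
```

In a real register this would live in a spreadsheet, but modeling the columns as typed fields is a quick way to check that every entry captures the same information.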
The Scoring Methodology
I used a 3x3 Likelihood x Impact matrix. Each dimension is rated Low (1), Medium (2), or High (3), and the Risk Score is the product of the two values.
|  | Low Impact (1) | Medium Impact (2) | High Impact (3) |
| --- | --- | --- | --- |
| Low Likelihood (1) | 1 | 2 | 3 |
| Medium Likelihood (2) | 2 | 4 | 6 |
| High Likelihood (3) | 3 | 6 | 9 |
Risk Levels map as follows:
- 1–2: Low
- 3–4: Medium
- 6–9: High (5, 7, and 8 are not possible products in a 3x3 matrix)
I chose a 3x3 deliberately. More mature programs use 5x5 matrices or quantitative models, and eventually that is where this organization should head. But at this maturity level, a 3x3 produces actionable results without adding complexity the organization cannot use. The goal is structured thinking, not perfection.
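The scoring and mapping above can be sketched as two small functions. This is a minimal illustration of the methodology, not the register's actual implementation.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Risk Score is the product of Likelihood and Impact, each rated 1-3."""
    if likelihood not in (1, 2, 3) or impact not in (1, 2, 3):
        raise ValueError("ratings must be 1 (Low), 2 (Medium), or 3 (High)")
    return likelihood * impact

def risk_level(score: int) -> str:
    """Map a 3x3 score to its qualitative label for stakeholder readability."""
    if score <= 2:
        return "Low"
    if score <= 4:
        return "Medium"
    return "High"

# Example: High likelihood, High impact -> 9, the register's top score
print(risk_level(risk_score(3, 3)))
```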

Walking Through the Register
The register identified 11 risks across four categories. Here is what I found and why I scored things the way I did.
Cybersecurity
R-001: Unauthorized access to donor or member data (Score: 9, High)
This is the highest-scoring risk in the register and it is not close. Password-only authentication with no MFA protecting sensitive donor and member data is a significant gap. Credential-based attacks are common, the data is valuable, and the controls currently in place are minimal. Recommended actions: implement MFA, enforce stronger password policies, and conduct periodic access reviews.
R-002: Phishing attacks on officers or members (Score: 6, High)
No phishing awareness training, no email filtering tools. The impact of a successful phishing attack is high: compromised accounts, potential data exposure, reputational damage. I rated likelihood Medium rather than High because the organization's low profile likely reduces its attractiveness as a target. That said, low profile is not a security control, and it is not something to rely on.
R-007: Poor password management and single account ownership (Score: 3, Medium)
Accounts are managed individually with no centralized password manager and no documented ownership or recovery procedures. The risk here is not just unauthorized access. It is operational continuity. If the one person who holds the credentials to a critical account leaves the organization, that account may be permanently inaccessible. That is a real problem that plays out in small organizations more than people expect.
Operational
R-003: Loss of critical records due to lack of data backups (Score: 6, High)
Data is scattered across individual devices and shared platforms with no formal backup schedule. I scored likelihood Medium rather than Low because informal storage arrangements fail more often than people assume. The impact of permanent data loss for a non-profit (donor records, financial history, meeting minutes) is severe and potentially irreversible.
R-008: Dependency on a single individual for critical operations (Score: 2, Low)
This is often called a key person dependency or bus factor risk. If one person holds all institutional knowledge for a critical function and they leave, operations stall. The fix is straightforward: document procedures and cross-train at least one backup person. This scored Low because the likelihood of it becoming a short-term crisis is low, but left unaddressed it compounds over time.
R-010: Unsafe facility security practices (Score: 3, Medium)
The organization has security cameras and a designated monitor at the entrance during meetings. Those controls are meaningful, which is exactly why documenting them matters. The gap is that there is no formal procedure for closing the facility, and individuals sometimes secure the building alone. A two-person rule for closing and a written securing procedure are low-cost, high-value fixes.
R-011: Lack of emergency preparedness procedures (Score: 6, High)
No documented emergency response procedures for fire, medical incidents, or security breaches. No drills. For an organization that hosts gatherings in a physical facility, this is a meaningful gap. What makes it tractable is that the remediation is almost entirely documentation and communication, not technology or budget.
Compliance
R-004: Improper handling of member personal information (Score: 2, Low)
This scored Low, but I want to be clear about why I still included it. Without a formal data handling and privacy policy, the organization has no framework for how sensitive information should be stored, shared, or disposed of. Privacy expectations, regulatory and otherwise, are only going to increase. Getting ahead of this now is easier than reacting later.
R-009: Excessive physical access due to lack of a key policy (Score: 6, High)
Keys distributed based on trust with no centralized inventory, no sign-out process, and no periodic access review. This scored High because the combination of realistic likelihood and meaningful impact is hard to argue with. Unauthorized physical access to sensitive documents or the facility itself is not a theoretical concern for a non-profit with member records on-site.
Financial
R-005: Misuse or misallocation of funds due to lack of financial controls (Score: 3, Medium)
Financial responsibilities are assigned to designated officers, but there is no formal segregation of duties and no documented financial controls. This risk is marked Mitigated in the register because some informal oversight exists. That label deserves some honesty, though: Mitigated does not mean resolved. Informal oversight is better than nothing, and documented controls with segregation of duties are better than informal oversight.
What the Register Reveals
A few patterns stand out when you look at this holistically.
The highest-priority risks cluster around cybersecurity and physical access. In small organizations, these are often the least formally managed because there is no dedicated person to own them. Security by trust is common. It is also fragile.
Several risks share the same root cause: the absence of documented procedures. R-003, R-008, R-010, and R-011 all exist, at least in part, because informal practices have never been written down. Documentation costs time, not money. For a resource-constrained non-profit, that makes it one of the highest-return investments available.
Risk ownership matters more than people think. Every risk in this register has a named owner. Without that, a risk register is a list of observations. With it, it becomes a plan with accountability attached.
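Once every risk has a score, a status, and an owner, prioritization falls out almost mechanically: filter out what is already mitigated and sort by score. Here is a sketch over a subset of the register; scores and statuses match the walkthrough above, but the owner roles are hypothetical.

```python
# Illustrative subset of the register: (risk_id, score, owner, status).
# Scores and statuses come from the walkthrough; owner roles are invented.
register = [
    ("R-001", 9, "IT Volunteer Lead", "Open"),
    ("R-003", 6, "Operations Officer", "Open"),
    ("R-005", 3, "Treasurer", "Mitigated"),
    ("R-008", 2, "Operations Officer", "Open"),
]

# Prioritized worklist: skip mitigated items, highest score first
worklist = sorted(
    (r for r in register if r[3] != "Mitigated"),
    key=lambda r: r[1],
    reverse=True,
)

for risk_id, score, owner, status in worklist:
    print(f"{risk_id} (score {score}) -> {owner}")
```

The point of the owner column is exactly this last loop: every item on the worklist comes with a role already attached, so prioritization and accountability travel together.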
Framework Alignment
This register was not built against a specific framework, but the methodology aligns with a couple that are worth calling out.
The risk assessment process (identifying risks, rating likelihood and impact, and scoring them) maps directly to NIST SP 800-30, which provides guidance on conducting risk assessments for information systems. That was the closest framework to what I actually did here.
The overall structure (identifying risks, assigning ownership, tracking status, and working toward remediation) is also consistent with the principles in ISO 31000, a general risk management standard applicable to any type of organization.
For the cybersecurity and compliance gaps identified, the control categories are consistent with what tends to surface in an ISO/IEC 27001 gap assessment, particularly around missing policies and undocumented procedures. That said, 27001 is a full information security management system standard. Claiming direct alignment without that broader context would be a stretch.
View the full risk register here
Thanks for reading. If you found this useful, follow along for more content on GRC, risk management, and privacy.