Calculating Risk Levels
Introduction to Risk Calculation
Risk calculation is the cornerstone of effective information security risk management. It transforms qualitative assessments into quantifiable measures that enable:
- Objective Decision-Making: Compare risks on a common scale
- Resource Allocation: Prioritize security investments based on risk levels
- Treatment Prioritization: Address the most critical risks first
- Stakeholder Communication: Present risks in understandable terms
- Progress Tracking: Monitor risk reduction over time
- Compliance Demonstration: Show systematic risk management to auditors
The fundamental principle is simple yet powerful: Risk = Likelihood × Impact
However, the practical application of this formula requires careful consideration of multiple factors, control effectiveness, and risk context.
The Risk Calculation Formula
Basic Formula
Risk Level = Likelihood Score × Impact Score
Understanding Each Component
Likelihood (Probability): The chance that a threat will exploit a vulnerability
- Threat frequency
- Vulnerability exploitability
- Existing control effectiveness
- Historical incident data
- Threat actor capability and motivation
Impact (Consequence): The magnitude of harm if the risk materializes
- Financial losses
- Operational disruption
- Reputational damage
- Legal/regulatory consequences
- Data confidentiality, integrity, or availability loss
Risk Score: The product of likelihood and impact, representing overall risk exposure
Risk Scoring Scales
5×5 Risk Matrix (Most Common)
Likelihood Scale
| Score | Level | Description | Frequency |
|---|---|---|---|
| 5 | Almost Certain | Expected to occur in most circumstances | >80% chance or >10 times/year |
| 4 | Likely | Will probably occur in most circumstances | 60-80% chance or 3-10 times/year |
| 3 | Possible | Might occur at some time | 40-60% chance or 1-3 times/year |
| 2 | Unlikely | Could occur at some time | 20-40% chance or once every 1-3 years |
| 1 | Rare | May occur only in exceptional circumstances | <20% chance or <once every 3 years |
Impact Scale
| Score | Level | Description | Financial | Operational | Reputational |
|---|---|---|---|---|---|
| 5 | Catastrophic | Severe impact on organization | >$1M loss | Complete shutdown >1 week | National media coverage |
| 4 | Major | Significant impact | $500K-$1M | Critical systems down 2-7 days | Industry-wide attention |
| 3 | Moderate | Noticeable impact | $100K-$500K | Major disruption 1-2 days | Local media coverage |
| 2 | Minor | Limited impact | $10K-$100K | Minor disruption <1 day | Internal awareness only |
| 1 | Insignificant | Minimal impact | <$10K | Negligible disruption | No external awareness |
Note: Adjust these thresholds based on your organization's size, industry, and risk appetite.
Risk Score Interpretation
| Risk Score Range | Risk Rating | Color Code | Action Required |
|---|---|---|---|
| 20-25 | Extreme | Red | Immediate action - Escalate to executive leadership |
| 15-19 | High | Orange | Senior management attention - Treat within 30 days |
| 10-14 | Medium | Yellow | Management attention - Treat within 90 days |
| 5-9 | Low | Green | Monitor - Treat within 12 months |
| 1-4 | Very Low | Blue | Accept - Review annually |
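To show how the scales and interpretation bands above combine, here is a minimal Python sketch that multiplies likelihood and impact and maps the result to the rating bands in the table (the thresholds mirror this table and should be adjusted if you customize your bands):

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Multiply 1-5 likelihood and impact scores into a 1-25 risk score."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("Likelihood and impact must be between 1 and 5")
    return likelihood * impact


def risk_rating(score: int) -> str:
    """Map a 1-25 risk score to the rating bands from the interpretation table."""
    if score >= 20:
        return "Extreme"
    if score >= 15:
        return "High"
    if score >= 10:
        return "Medium"
    if score >= 5:
        return "Low"
    return "Very Low"


# Worked example: Likely (4) x Catastrophic (5)
score = risk_score(4, 5)
print(score, risk_rating(score))  # 20 Extreme
```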
Step-by-Step Risk Calculation Process
Step 1: Identify the Risk Scenario
Clearly define what you're calculating:
- Asset: What is at risk?
- Threat: What could go wrong?
- Vulnerability: What weakness enables the threat?
- Impact: What would be the consequence?
Example: Unauthorized access to customer database due to weak password policy
Step 2: Assess Inherent Likelihood
Determine the likelihood WITHOUT considering existing controls:
Factors to Consider:
- Threat source motivation and capability
- Vulnerability exploitability
- Historical incident data
- Industry threat intelligence
- Environmental factors
Example Assessment:
- Threat: External hackers actively target customer databases (High motivation)
- Vulnerability: Weak passwords (Easily exploitable)
- Industry data: Common attack vector
- Inherent Likelihood: 4 (Likely)
Step 3: Assess Inherent Impact
Determine the impact WITHOUT considering existing controls:
Factors to Consider:
- Number of records/assets affected
- Data sensitivity classification
- Regulatory requirements (GDPR, CCPA, etc.)
- Financial exposure
- Operational dependencies
- Reputational sensitivity
Example Assessment:
- 100,000 customer records including PII
- GDPR penalties: up to €20M or 4% of global annual turnover, whichever is higher
- Media attention likely
- Customer trust damage
- Inherent Impact: 5 (Catastrophic)
Step 4: Calculate Inherent Risk
Inherent Risk = Inherent Likelihood × Inherent Impact
Inherent Risk = 4 × 5 = 20 (Extreme)
Inherent Risk represents the risk level before any controls are applied.
Step 5: Identify Existing Controls
Document all controls currently in place:
Example Controls:
- Password complexity policy (minimum 12 characters)
- Multi-factor authentication (MFA)
- Account lockout after 5 failed attempts
- Annual security awareness training
- Database access logging and monitoring
- Quarterly access reviews
- Intrusion detection system (IDS)
Step 6: Assess Control Effectiveness
Evaluate how well each control reduces likelihood or impact:
Control Effectiveness Rating
| Rating | Effectiveness | Reduction | Description |
|---|---|---|---|
| 3 | High | 60-80% | Control is well-designed, consistently applied, and regularly tested |
| 2 | Medium | 30-50% | Control exists but has gaps in design, implementation, or monitoring |
| 1 | Low | 10-20% | Control is poorly implemented, inconsistently applied, or rarely tested |
| 0 | None | 0% | Control doesn't exist or is completely ineffective |
Example Control Assessment:
| Control | Type | Effectiveness | Reduces | Notes |
|---|---|---|---|---|
| MFA | Preventive | 3 (High) | Likelihood | 95% user adoption, enforced |
| Password Policy | Preventive | 2 (Medium) | Likelihood | Some users still use weak patterns |
| Access Logging | Detective | 2 (Medium) | Impact | Logs reviewed weekly |
| Account Lockout | Preventive | 3 (High) | Likelihood | Automatic, always active |
| IDS | Detective | 2 (Medium) | Likelihood & Impact | Alerts sometimes missed |
Step 7: Calculate Residual Likelihood
Apply control effectiveness to reduce inherent likelihood:
Method 1: Percentage Reduction
Residual Likelihood = Inherent Likelihood × (1 − Combined Control Effectiveness)
where effectiveness is expressed as a decimal (for example, a 60% effective control set gives 4 × 0.4 = 1.6, which rounds to 2)
Method 2: Score Reduction (Simplified)
- High effectiveness controls: Reduce by 2 levels
- Medium effectiveness controls: Reduce by 1 level
- Low effectiveness controls: Reduce by 0.5 levels
Example Calculation (Method 2):
- Inherent Likelihood: 4 (Likely)
- MFA (High): -2 levels
- Password Policy (Medium): -1 level
- Account Lockout (High): -2 levels
- Combined reduction: capped at -2 levels (the individual reductions total -5, but overlapping controls do not stack linearly, so apply a realistic maximum)
- Residual Likelihood: 2 (Unlikely)
Step 8: Calculate Residual Impact
Apply control effectiveness to reduce inherent impact:
Example Calculation:
- Inherent Impact: 5 (Catastrophic)
- Access Logging (Medium): -1 level (faster detection, reduced exposure)
- IDS (Medium): -1 level (limits data exfiltration)
- Combined reduction: -1 level (the two detective controls overlap, so their reductions are not simply additive)
- Residual Impact: 4 (Major)
Step 9: Calculate Residual Risk
Residual Risk = Residual Likelihood × Residual Impact
Residual Risk = 2 × 4 = 8 (Low)
Residual Risk represents the risk level after applying existing controls.
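Tying Steps 7-9 together, here is a minimal Python sketch of the simplified (Method 2) reduction. The per-rating reductions come from Step 7; the caps of 2 levels for likelihood and 1 level for impact mirror the judgment applied in the worked example and are assumptions to tune, not fixed rules:

```python
# Per-control reduction in levels (Method 2): High=2, Medium=1, Low=0.5, None=0
REDUCTION_BY_RATING = {3: 2.0, 2: 1.0, 1: 0.5, 0: 0.0}


def residual_score(inherent: int, control_ratings: list[int], cap: float) -> int:
    """Apply Method 2 reductions to an inherent likelihood or impact score,
    capped at a judgment-based maximum realistic reduction."""
    reduction = min(cap, sum(REDUCTION_BY_RATING[r] for r in control_ratings))
    return max(1, round(inherent - reduction))


# Worked example (Steps 7-9); caps reflect the judgment applied above
residual_likelihood = residual_score(4, [3, 2, 3], cap=2)  # MFA, password policy, lockout -> 2
residual_impact = residual_score(5, [2, 2], cap=1)         # access logging, IDS -> 4
residual_risk = residual_likelihood * residual_impact
print(residual_likelihood, residual_impact, residual_risk)  # 2 4 8 (Low)
```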
Step 10: Determine Risk Treatment
Compare residual risk against risk acceptance criteria:
| Residual Risk | Treatment Decision |
|---|---|
| 8 (Low) | MONITOR - Risk within acceptable range, continue monitoring |
Treatment Options:
- If residual risk is acceptable: Accept and monitor
- If residual risk is unacceptable: Apply additional treatment (modify, avoid, or transfer)
Control Effectiveness Assessment
Testing Control Effectiveness
1. Design Effectiveness
- Is the control designed to address the identified risk?
- Are control objectives clearly defined?
- Does the control cover all relevant scenarios?
2. Implementation Effectiveness
- Is the control fully implemented as designed?
- Are there any gaps in deployment?
- Is coverage comprehensive across the organization?
3. Operating Effectiveness
- Does the control operate consistently?
- Are control activities documented and evidenced?
- How frequently does the control fail?
Evidence Collection Methods
| Control Type | Testing Method | Evidence Examples |
|---|---|---|
| Automated | System query/report | Configuration screenshots, logs |
| Manual | Sampling | Process documentation, approval records |
| Preventive | Attempt bypass | Penetration test results, failed access logs |
| Detective | Review alerts | Incident records, monitoring reports |
Control Effectiveness Documentation
For each control, document:
Control ID: C-001
Control Name: Multi-Factor Authentication
Control Type: Preventive (Likelihood Reducing)
Description: All users must authenticate using password + MFA token
Effectiveness Assessment:
- Design: High - Aligned with NIST 800-63B
- Implementation: High - 98% user coverage, exceeding the 95% requirement
- Operating: High - 99.7% uptime, bypass attempt rate below 0.1%
- Overall Rating: 3 (High)
Testing Evidence:
- MFA configuration review (Date: 2024-11-15)
- User enrollment report showing 98% coverage
- Failed authentication logs
- Annual penetration test results
Last Tested: 2024-11-15
Next Review: 2025-02-15
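If control effectiveness records live in a tool or script rather than a document, the template above maps onto a simple data structure. A minimal Python sketch, with field names chosen for illustration rather than taken from any standard schema:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ControlRecord:
    """Control effectiveness record mirroring the documentation template above."""
    control_id: str              # e.g. "C-001"
    name: str                    # e.g. "Multi-Factor Authentication"
    control_type: str            # "Preventive" or "Detective"
    reduces: str                 # "Likelihood", "Impact", or "Both"
    design: int                  # 0-3 design effectiveness
    implementation: int          # 0-3 implementation effectiveness
    operating: int               # 0-3 operating effectiveness
    overall_rating: int          # 0-3 overall rating assigned by the assessor
    last_tested: date
    next_review: date
    evidence: list[str] = field(default_factory=list)


mfa = ControlRecord(
    control_id="C-001", name="Multi-Factor Authentication",
    control_type="Preventive", reduces="Likelihood",
    design=3, implementation=3, operating=3, overall_rating=3,
    last_tested=date(2024, 11, 15), next_review=date(2025, 2, 15),
    evidence=["MFA configuration review", "User enrollment report (98% coverage)"],
)
```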
Complete Risk Calculation Examples
Example 1: Email Phishing Attack
Risk Scenario
- Asset: Employee workstations and credentials
- Threat: Phishing emails leading to credential theft
- Vulnerability: Lack of email filtering and user awareness
Inherent Risk Assessment
- Inherent Likelihood: 5 (Almost Certain) - Phishing is constant and increasing
- Inherent Impact: 4 (Major) - Could lead to ransomware, data breach
- Inherent Risk: 5 × 4 = 20 (Extreme)
Existing Controls
- Email spam filter (Medium effectiveness - Rating: 2)
- Security awareness training (Medium effectiveness - Rating: 2)
- Email banner warnings for external emails (Low effectiveness - Rating: 1)
- Endpoint antivirus (Medium effectiveness - Rating: 2)
Residual Risk Assessment
- Residual Likelihood: 3 (Possible) - Controls reduce by 2 levels
- Residual Impact: 3 (Moderate) - Early detection reduces impact by 1 level
- Residual Risk: 3 × 3 = 9 (Low)
Treatment Decision
Monitor with additional controls recommended:
- Implement advanced email threat protection (reduces likelihood to 2)
- Deploy phishing simulation program (reduces likelihood to 2)
- Target residual risk: 6 (Low)
Example 2: Unpatched Server Vulnerability
Risk Scenario
- Asset: Internet-facing web server
- Threat: Exploitation of known critical vulnerability
- Vulnerability: Missing security patches
Inherent Risk Assessment
- Inherent Likelihood: 4 (Likely) - Publicly known exploit available
- Inherent Impact: 5 (Catastrophic) - Complete server compromise
- Inherent Risk: 4 × 5 = 20 (Extreme)
Existing Controls
- Web application firewall (Medium effectiveness - Rating: 2)
- Intrusion detection system (Medium effectiveness - Rating: 2)
- Monthly vulnerability scans (Low effectiveness - Rating: 1)
- Network segmentation (High effectiveness - Rating: 3)
Residual Risk Assessment
- Residual Likelihood: 3 (Possible) - WAF provides partial protection
- Residual Impact: 4 (Major) - Segmentation limits lateral movement
- Residual Risk: 3 × 4 = 12 (Medium)
Treatment Decision
Treat immediately:
- Apply security patch (reduces likelihood to 1)
- Implement automated patch management (reduces likelihood to 1)
- Target residual risk: 4 (Very Low)
Example 3: Insider Threat - Data Theft
Risk Scenario
- Asset: Confidential intellectual property
- Threat: Malicious or negligent employee data exfiltration
- Vulnerability: Excessive user privileges
Inherent Risk Assessment
- Inherent Likelihood: 3 (Possible) - Historical incidents in industry
- Inherent Impact: 5 (Catastrophic) - Loss of competitive advantage
- Inherent Risk: 3 × 5 = 15 (High)
Existing Controls
- Data loss prevention (DLP) system (High effectiveness - Rating: 3)
- User access reviews quarterly (Medium effectiveness - Rating: 2)
- Employee background checks (Low effectiveness - Rating: 1)
- Exit interview procedures (Low effectiveness - Rating: 1)
- Activity logging and monitoring (Medium effectiveness - Rating: 2)
Residual Risk Assessment
- Residual Likelihood: 2 (Unlikely) - DLP and monitoring reduce by 1 level
- Residual Impact: 4 (Major) - DLP limits data volume that can be stolen
- Residual Risk: 2 × 4 = 8 (Low)
Treatment Decision
Monitor with enhancements:
- Implement privileged access management (PAM)
- Enhance user behavior analytics
- Target residual risk: 6 (Low)
Example 4: Third-Party Vendor Breach
Risk Scenario
- Asset: Customer data processed by cloud service provider
- Threat: Security breach at vendor leading to data exposure
- Vulnerability: Limited visibility into vendor security practices
Inherent Risk Assessment
- Inherent Likelihood: 3 (Possible) - Vendor breaches are common
- Inherent Impact: 5 (Catastrophic) - Regulatory penalties, customer trust loss
- Inherent Risk: 3 × 5 = 15 (High)
Existing Controls
- Vendor security assessment (Medium effectiveness - Rating: 2)
- Contractual security requirements (Low effectiveness - Rating: 1)
- Data encryption in transit and at rest (High effectiveness - Rating: 3)
- Annual vendor audits (Medium effectiveness - Rating: 2)
Residual Risk Assessment
- Residual Likelihood: 3 (Possible) - Assessment doesn't reduce likelihood much
- Residual Impact: 3 (Moderate) - Encryption significantly reduces impact
- Residual Risk: 3 × 3 = 9 (Low)
Treatment Decision
Monitor with ongoing vendor management:
- Require SOC 2 Type II certification
- Implement continuous vendor monitoring
- Add cyber insurance coverage (transfer)
- Target residual risk: 6 (Low)
Common Risk Calculation Mistakes
Mistake 1: Confusing Inherent and Residual Risk
Wrong Approach: Calculating inherent risk with controls already factored in
Correct Approach:
- Always start with inherent risk (no controls)
- Then apply control effectiveness
- Calculate residual risk
Why It Matters: You need to understand the true risk exposure to validate control investments.
Mistake 2: Overestimating Control Effectiveness
Wrong Approach: Assuming controls are 100% effective without testing
Correct Approach:
- Test controls regularly
- Document evidence of effectiveness
- Adjust ratings based on actual performance
- Consider control gaps and failures
Reality Check: Even "good" controls rarely exceed 80% effectiveness
Mistake 3: Double-Counting Controls
Wrong Approach: Applying the same control to reduce both likelihood and impact multiple times
Correct Approach:
- Clearly categorize each control
- Apply to likelihood OR impact (some can affect both, but count once)
- Document the reduction logic
Example: MFA reduces likelihood by preventing unauthorized access, not impact.
Mistake 4: Ignoring Control Dependencies
Wrong Approach: Assessing controls in isolation
Correct Approach:
- Consider control combinations (defense in depth)
- Recognize when one control depends on another
- Adjust effectiveness if dependencies fail
Example: IDS effectiveness depends on network monitoring tools being operational.
Mistake 5: Using Arbitrary or Inconsistent Scales
Wrong Approach: Different assessors using different criteria
Correct Approach:
- Define and document scoring criteria
- Provide clear examples for each level
- Train all risk assessors
- Calibrate assessments across the organization
Best Practice: Create a risk assessment guide with specific examples for your industry.
Mistake 6: Calculating Risk Once and Never Updating
Wrong Approach: Treating risk calculation as a one-time exercise
Correct Approach:
- Recalculate after control changes
- Review quarterly or when threat landscape changes
- Update after incidents
- Track risk trends over time
Trigger Events:
- New vulnerabilities discovered
- Control failures
- Organizational changes
- Regulatory updates
Mistake 7: Neglecting Cumulative Risk
Wrong Approach: Assessing each risk in isolation
Correct Approach:
- Consider risk aggregation across assets
- Identify common vulnerabilities
- Assess cascading failures
- Calculate overall organizational risk exposure
Example: 10 "Low" risks affecting the same critical system may collectively represent a "High" risk.
Mistake 8: Mathematical Precision Without Practical Judgment
Wrong Approach: Blindly following formulas without context
Correct Approach:
- Use calculation as a starting point
- Apply expert judgment
- Consider qualitative factors
- Adjust based on organizational context
Remember: Risk management is both science and art.
Risk Register Integration
Recording Calculations in the Risk Register
Each risk in your register should include:
| Field | Example Value |
|---|---|
| Risk ID | RISK-2024-045 |
| Asset | Customer Database |
| Threat | Unauthorized Access |
| Vulnerability | Weak Password Policy |
| Inherent Likelihood | 4 (Likely) |
| Inherent Impact | 5 (Catastrophic) |
| Inherent Risk Score | 20 (Extreme) |
| Existing Controls | MFA, Password Policy, Access Logs, Account Lockout, IDS |
| Control Effectiveness | High/Medium/Medium/High/Medium |
| Residual Likelihood | 2 (Unlikely) |
| Residual Impact | 4 (Major) |
| Residual Risk Score | 8 (Low) |
| Risk Treatment | Monitor - Review quarterly |
| Calculation Date | 2024-11-15 |
| Next Review Date | 2025-02-15 |
| Calculated By | Jane Smith, CISO |
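When the register is maintained programmatically, the computed fields (scores and ratings) can be derived from the raw likelihood and impact values so they never drift out of sync. A minimal Python sketch assuming the 5×5 scales and rating bands defined earlier; the field names are illustrative:

```python
from dataclasses import dataclass


def rating(score: int) -> str:
    """Rating bands from the risk score interpretation table."""
    bands = [(20, "Extreme"), (15, "High"), (10, "Medium"), (5, "Low")]
    return next((label for floor, label in bands if score >= floor), "Very Low")


@dataclass
class RiskRegisterEntry:
    risk_id: str
    asset: str
    threat: str
    vulnerability: str
    inherent_likelihood: int
    inherent_impact: int
    residual_likelihood: int
    residual_impact: int

    @property
    def inherent_risk(self) -> int:
        return self.inherent_likelihood * self.inherent_impact

    @property
    def residual_risk(self) -> int:
        return self.residual_likelihood * self.residual_impact


entry = RiskRegisterEntry("RISK-2024-045", "Customer Database", "Unauthorized Access",
                          "Weak Password Policy", 4, 5, 2, 4)
print(entry.inherent_risk, rating(entry.inherent_risk))   # 20 Extreme
print(entry.residual_risk, rating(entry.residual_risk))   # 8 Low
```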
Risk Score Trends
Track risk scores over time:
Risk ID: RISK-2024-045
| Date | Inherent | Residual | Status | Actions Taken |
|---|---|---|---|---|
| 2024-01-15 | 20 | 16 | High | Initial assessment |
| 2024-04-15 | 20 | 12 | Medium | Implemented MFA |
| 2024-07-15 | 20 | 10 | Medium | Enhanced logging |
| 2024-11-15 | 20 | 8 | Low | Added IDS |
| 2025-02-15 | TBD | TBD | TBD | Quarterly review due |
Risk Calculation Audit Trail
Maintain documentation:
Risk Calculation Worksheet - RISK-2024-045
Date: 2024-11-15
Assessor: Jane Smith, CISO
Reviewer: John Doe, IT Manager
Inherent Risk Calculation:
- Likelihood Rationale: Phishing attacks targeting similar organizations
occur 5-10 times per year. Weak passwords make exploitation likely.
Industry data shows 65% probability within 12 months.
Score: 4 (Likely)
- Impact Rationale: Database contains 100K customer records with PII.
GDPR penalties up to €20M. Estimated financial impact: $800K.
Reputational damage: Major media coverage expected.
Score: 5 (Catastrophic)
- Inherent Risk: 4 × 5 = 20 (Extreme)
Control Effectiveness Assessment:
[Detailed control evaluation as shown in Step 6]
Residual Risk Calculation:
[Detailed residual calculation as shown in Steps 7-9]
Approval:
- Risk Owner: Sarah Johnson (Approved: 2024-11-16)
- CISO: Jane Smith (Approved: 2024-11-16)
- CIO: Mike Davis (Approved: 2024-11-17)
Alternative Risk Calculation Methods
Qualitative Risk Assessment (No Numbers)
Instead of numeric scores, use descriptive categories:
Risk: Unauthorized database access
Likelihood: Likely (based on threat intelligence)
Impact: Catastrophic (based on data sensitivity)
Risk Level: Extreme (requires immediate action)
Advantages:
- Simpler for non-technical stakeholders
- Less false precision
- Faster assessment
Disadvantages:
- Harder to prioritize similar risks
- Less objective
- Difficult to track trends
Quantitative Risk Assessment (Financial)
Calculate in monetary terms:
Annual Loss Expectancy (ALE) = Annual Rate of Occurrence (ARO) × Single Loss Expectancy (SLE)
Example:
- ARO: 0.4 (once every 2.5 years)
- SLE: $800,000 (cost per incident)
- ALE: 0.4 × $800,000 = $320,000
Compare ALE against control costs to determine ROI
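For illustration, a small Python sketch of the ALE calculation and a cost-benefit comparison; the $50K control cost and the reduced ARO are hypothetical figures, not values from the example above:

```python
def ale(aro: float, sle: float) -> float:
    """Annual Loss Expectancy = Annual Rate of Occurrence x Single Loss Expectancy."""
    return aro * sle


# Example from the text: once every 2.5 years, $800K per incident
ale_before = ale(0.4, 800_000)             # $320,000

# Hypothetical control: costs $50K/year and cuts ARO to once every 10 years
control_cost = 50_000
ale_after = ale(0.1, 800_000)              # $80,000
net_benefit = (ale_before - ale_after) - control_cost
print(ale_before, ale_after, net_benefit)  # 320000.0 80000.0 190000.0
```

A positive net benefit suggests the control pays for itself in reduced expected losses; a negative one suggests accepting, transferring, or treating the risk differently.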
Advantages:
- Enables cost-benefit analysis
- Financial terms familiar to executives
- Supports budget justification
Disadvantages:
- Requires extensive data
- Time-consuming
- Difficult to quantify reputational impact
Hybrid Approach (Recommended)
Combine qualitative and quantitative:
- Use qualitative for initial screening
- Apply semi-quantitative (5×5 matrix) for most risks
- Perform detailed quantitative analysis for top risks
Risk Calculation Tools and Templates
Spreadsheet Template
Create an Excel/Google Sheets calculator:
Columns:
A: Risk ID
B: Risk Description
C: Inherent Likelihood (1-5)
D: Inherent Impact (1-5)
E: Inherent Risk (=C2*D2)
F: Inherent Rating (=IF(E2>=20,"Extreme",IF(E2>=15,"High",IF(E2>=10,"Medium",IF(E2>=5,"Low","Very Low")))))
G: Control 1 Effectiveness (0-3)
H: Control 2 Effectiveness (0-3)
I: Control 3 Effectiveness (0-3)
J: Total Control Effectiveness (=SUM(G2:I2))
K: Residual Likelihood (=MAX(1,C2-FLOOR(J2/2,1)))
L: Residual Impact (=MAX(1,D2-FLOOR(J2/3,1)))
M: Residual Risk (=K2*L2)
N: Residual Rating (same formula as F, referencing M2)
O: Risk Reduction (=E2-M2)
P: Treatment Decision
Formulas are shown for row 2; fill them down for each additional risk.
Risk Heat Map
Visual representation of risk scores:
| Likelihood \ Impact | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| 5 | 5 | 10 | 15 | 20 | 25 |
| 4 | 4 | 8 | 12 | 16 | 20 |
| 3 | 3 | 6 | 9 | 12 | 15 |
| 2 | 2 | 4 | 6 | 8 | 10 |
| 1 | 1 | 2 | 3 | 4 | 5 |
Plot inherent and residual risks to visualize treatment effectiveness.
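Rather than drawing the grid by hand, you can generate it (and the rating band of each cell) from the scales themselves. A small Python sketch reusing the rating bands from the interpretation table:

```python
def rating(score: int) -> str:
    """Rating bands from the risk score interpretation table."""
    bands = [(20, "Extreme"), (15, "High"), (10, "Medium"), (5, "Low")]
    return next((label for floor, label in bands if score >= floor), "Very Low")


# Rows: likelihood 5 (top) down to 1; columns: impact 1 to 5
for likelihood in range(5, 0, -1):
    row = [f"{likelihood * impact:>2} {rating(likelihood * impact):<9}"
           for impact in range(1, 6)]
    print(f"L={likelihood}: " + " | ".join(row))
```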
Risk Calculation Review and Validation
Peer Review Process
- Initial Assessment: Risk owner performs calculation
- Peer Review: Another risk assessor validates methodology
- Expert Review: Subject matter expert reviews technical assumptions
- Management Review: Senior management approves risk ratings
- Audit Validation: Internal audit samples calculations
Calibration Sessions
Conduct quarterly calibration meetings:
- Review risk assessments as a group
- Discuss edge cases and borderline ratings
- Ensure consistency across departments
- Update assessment criteria based on lessons learned
Key Validation Questions
- Is the risk scenario clearly defined?
- Are likelihood and impact assessments supported by evidence?
- Have all relevant controls been identified?
- Is control effectiveness properly tested and documented?
- Are calculations mathematically correct?
- Do residual risk levels align with management judgment?
- Are treatment decisions appropriate for risk levels?
Practical Exercise: Calculate Your Own Risk
Step-by-Step Worksheet
Use this template to calculate a risk from your organization:
RISK CALCULATION WORKSHEET
1. Risk Identification
Asset: _________________________________
Threat: _________________________________
Vulnerability: _________________________________
Risk Scenario: _________________________________
2. Inherent Risk Assessment
Likelihood Score (1-5): _____
Likelihood Justification: _________________________________
Impact Score (1-5): _____
Impact Justification: _________________________________
Inherent Risk Score: _____ × _____ = _____
Inherent Risk Rating: _________________________________
3. Existing Controls
Control 1: _________________ Effectiveness (0-3): _____
Control 2: _________________ Effectiveness (0-3): _____
Control 3: _________________ Effectiveness (0-3): _____
Control 4: _________________ Effectiveness (0-3): _____
4. Residual Risk Assessment
Residual Likelihood Score (1-5): _____
Reduction Justification: _________________________________
Residual Impact Score (1-5): _____
Reduction Justification: _________________________________
Residual Risk Score: _____ × _____ = _____
Residual Risk Rating: _________________________________
5. Treatment Decision
Is residual risk acceptable? YES / NO
Treatment option selected: _________________________________
Additional controls planned: _________________________________
Target residual risk: _____
6. Approval
Risk Owner: _________________ Date: _____
Assessor: _________________ Date: _____
Reviewer: _________________ Date: _____
Key Takeaways
- Risk Calculation is Systematic: Follow a consistent, documented process for all risk assessments
- Inherent vs Residual Risk: Always calculate both to understand true risk exposure and control effectiveness
- Control Effectiveness Matters: Don't assume controls are 100% effective; test and validate
- Documentation is Critical: Maintain clear audit trails of calculations, assumptions, and decisions
- Risk is Dynamic: Recalculate regularly as threats, vulnerabilities, and controls change
- Judgment Over Formula: Use calculations as input to informed decision-making, not as absolute truth
- Consistency is Key: Apply scoring criteria consistently across the organization
- Integration with Risk Register: Ensure calculations feed directly into your risk management process
Common Questions
Q: How often should we recalculate risks? A: Minimum quarterly for all risks, immediately after significant changes (new threats, control failures, incidents, organizational changes).
Q: What if different assessors calculate different scores for the same risk? A: This indicates a need for better criteria definition, training, and calibration. Conduct calibration sessions to align understanding.
Q: Should we calculate risk for every asset? A: Focus on critical assets first (those supporting critical business processes or containing sensitive data). Use asset classification to prioritize.
Q: How precise should our calculations be? A: Avoid false precision. Risk assessment is inherently subjective. Focus on getting the risk level (Extreme/High/Medium/Low) right, not the exact score.
Q: Can we use different scales (e.g., 3×3 or 7×7)? A: Yes, but 5×5 is most common and provides good balance between granularity and simplicity. Choose one and use it consistently.
Q: What if our residual risk is still too high after all controls? A: This indicates additional treatment is needed. Explore additional controls, risk transfer (insurance), or risk avoidance (changing the process).
Next Lesson: In Lesson 3.6, you'll learn how to create a comprehensive Risk Register template to document and track all your risk calculations, treatments, and monitoring activities in a centralized system.