Learn to manage Corrective & Preventive Actions from creation to closure
What is CAPA?
CAPA stands for Corrective Action and Preventive Action.
It's a systematic approach to investigating problems, implementing solutions, and preventing recurrence.
CAPA is required by most quality standards (ISO 9001, FDA 21 CFR Part 820, ISO 13485) and is fundamental
to continuous improvement.
✓ Corrective Action (CA)
React to problems that occurred
• Problem happened
• Find root cause
• Fix it permanently
• Verify effectiveness
Example: Customer found defect → investigate → fix process
✓ Preventive Action (PA)
Prevent problems before they occur
• Identify risk/potential issue
• Eliminate risk
• Prevent occurrence
• Monitor effectiveness
Example: Similar process fails → fix ours before it fails too
Why CAPA Matters
Organizations with strong CAPA systems see:
• 60-80% reduction in recurring nonconformances
• 30-50% decrease in customer complaints
• Faster resolution - days instead of months
• Regulatory compliance - FDA, ISO, customer audits
• Continuous improvement culture - proactive problem prevention
Organizations with weak CAPA systems experience:
• Same problems repeat
• No follow-through
• Lost action items
• Firefighting mode
• Failed audits
• Reactive culture
Step 1 of 6
The CAPA Workflow
Understanding the complete CAPA lifecycle
CAPA Process Flow
From identification to verified closure
1. Identify (Problem detected) → 2. Investigate (Root cause analysis) → 3. Plan (Define solution) → 4. Implement (Execute action) → 5. Verify (Check effectiveness) → 6. Close (Document & archive)
1. IDENTIFY - Problem Detection
Triggers: Customer complaint, audit finding, internal quality issue, near-miss, trend analysis, risk assessment
Action: Create CAPA record, assign owner, set priority
Time: Same day as detection
2. INVESTIGATE - Root Cause Analysis
Tools: 5 Whys, Fishbone diagram, data analysis, gemba observation
Action: Document root cause with evidence, not assumptions
Time: 5-15 days depending on complexity
3. PLAN - Solution Development
Define: Corrective/preventive actions, resources needed, timeline, success criteria
Action: Get approval for plan, assign responsibilities
Time: 3-7 days
4. IMPLEMENT - Execute Actions
Do: Process changes, training, system updates, documentation
Action: Update CAPA with implementation evidence (photos, records, signatures)
Time: 15-90 days depending on scope
5. VERIFY - Effectiveness Check
Monitor: Collect data over time (usually 30-90 days) to confirm the problem is solved
Action: Document verification results, metrics, evidence of sustained improvement
Time: 30-90 days monitoring period
6. CLOSE - Completion & Documentation
Finalize: Complete all documentation, capture lessons learned, standardize the solution
Action: Management review, formal closure, share learnings
Time: 1-3 days
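The six-stage lifecycle above can be sketched as a small state machine that refuses to skip stages (the most common failure is jumping from Implement straight to Close). This is an illustrative Python sketch, not the API of any real QMS system; all names here are hypothetical.

```python
from enum import Enum

class Stage(Enum):
    IDENTIFY = 1
    INVESTIGATE = 2
    PLAN = 3
    IMPLEMENT = 4
    VERIFY = 5
    CLOSE = 6

# Each stage may only advance to the next one -- no skipping Verify.
ALLOWED = {s: Stage(s.value + 1) for s in Stage if s.value < 6}

def advance(current: Stage, target: Stage) -> Stage:
    """Move a CAPA to the next stage, rejecting skipped steps."""
    if ALLOWED.get(current) != target:
        raise ValueError(f"Cannot jump from {current.name} to {target.name}")
    return target

advance(Stage.IMPLEMENT, Stage.VERIFY)    # allowed
# advance(Stage.IMPLEMENT, Stage.CLOSE)   # raises ValueError: Verify was skipped
```

Encoding the workflow this way makes premature closure a hard error rather than a judgment call.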
⚠️ Common CAPA Workflow Failures
1. Skipping Root Cause Investigation: Jumping straight to solution without understanding "why" → Same problem returns
2. No Verification Step: Assuming action worked without checking → Ineffective solutions go undetected
3. Premature Closure: Closing before confirming sustained improvement → Problem recurs after closure
4. Weak Action Plans: Vague actions like "retrain staff" without specifics → No one knows what to do
Step 2 of 6
CAPA Prioritization
Not all CAPAs are equal - prioritize for impact
Priority Classification System
How to assign priority levels effectively
Priority determines urgency and resources allocated to a CAPA. Use the SQDC framework
(Safety, Quality, Delivery, Cost) to assess impact. Higher priority = faster response, more resources, management attention.
🔴 Critical
Response Time: Start within 24 hours
Target Closure: 15-30 days
Approval: Senior management required
Example: FDA warning letter, employee injury, customer recall notice
🟠 High
Definition: Significant quality impact, major customer complaint, audit finding, production stoppage
Response Time: Start within 3 days
Target Closure: 30-60 days
Approval: Department manager
Example: Customer rejection, repeated defects, failed audit, material shortage
🟡 Medium
Definition: Moderate impact on quality/delivery, minor customer concern, process improvement opportunity
Response Time: Start within 7 days
Target Closure: 60-90 days
Approval: Supervisor
Example: Internal quality escape, delivery delay, process variation, documentation gap
🔵 Low
Definition: Minor improvement, cosmetic issue, efficiency opportunity, no immediate impact
Response Time: Start within 14 days
Target Closure: 90-120 days
Approval: CAPA coordinator
Example: Filing system improvement, label clarity, minor aesthetic issue
Priority Decision Matrix
Answer these questions to determine priority:
SAFETY: Could this cause injury or fatality? (Yes = Critical/High)
REGULATORY: Is this a violation of law/regulation? (Yes = Critical)
CUSTOMER: Does customer know or will they be impacted? (Yes = High/Critical)
BUSINESS: Will this stop production or lose major revenue? (Yes = High)
RECURRING: Has this happened before? (Yes = Increase priority by 1 level)
SCOPE: Does it affect multiple products/customers? (Yes = Increase priority)
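The decision matrix above can be translated into a small scoring function. This is a hedged sketch of one possible reading of the matrix: the base level when no question is answered "yes" is assumed to be Low, and the function name and arguments are illustrative, not from any standard.

```python
LEVELS = ["Low", "Medium", "High", "Critical"]

def classify_priority(*, safety=False, regulatory=False, customer=False,
                      business=False, recurring=False, multi_scope=False) -> str:
    """Map the six matrix questions to a priority level (illustrative)."""
    if regulatory or safety:
        idx = 3          # regulatory violation or injury risk -> Critical
    elif customer or business:
        idx = 2          # customer impact or production stop -> High
    else:
        idx = 0          # no flag raised -> Low (assumed default)
    if recurring:
        idx = min(idx + 1, 3)   # repeat problem: increase one level
    if multi_scope:
        idx = min(idx + 1, 3)   # multiple products/customers: increase one level
    return LEVELS[idx]

classify_priority(customer=True, recurring=True)   # -> "Critical"
```

A repeated customer-facing issue lands at Critical even though neither answer alone would: the "increase by one level" rules compound.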
Pro Tip: The "Red Button Test"
Ask: "If this problem happened right now, would I hit the emergency stop button?"
• If YES → Critical or High priority
• If "I'd investigate first" → Medium priority
• If "I'd note it for later" → Low priority
This gut-check helps cut through analysis paralysis on priority decisions.
Step 3 of 6
Creating Effective CAPAs
How to write clear, actionable CAPA records
CAPA Creation Best Practices
The anatomy of a well-written CAPA
A good CAPA is SMART: Specific, Measurable, Achievable, Relevant, Time-bound.
Poor CAPAs lead to confusion, delays, and ineffective solutions. Follow this structure:
Interactive CAPA Form
Practice filling out a CAPA for this scenario: "Customer received 50 units with incorrect labeling"
For comparison, a well-completed form for a different scenario looks like this:
Problem: "47 units rejected for scratches, Line 3, March 15, 2:00-4:00 PM"
Root Cause: "Fixture padding missing - not in maintenance checklist"
Action: "1) Add padding replacement to checklist 2) Inspect all fixtures 3) Train 3 techs on new procedure"
Owner: "J. Smith, Maintenance Supervisor"
Due: "March 30, 2026"
⚠️ Avoid These Common CAPA Mistakes
1. "Operator error" as root cause: This blames people, not systems. Ask WHY the error was possible.
2. "Retrain" as corrective action: Too vague. Train on what? When? How will you verify they learned?
3. Multiple unrelated actions: One CAPA should address one problem. Don't bundle unrelated issues.
4. No verification method: How will you confirm it worked? Define metrics upfront.
5. Unrealistic timelines: A 3-day target for a complex system change is guaranteed to miss its deadline.
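The SMART structure and the mistakes above lend themselves to an automated completeness check at record creation. The field names and the vague-action blacklist below are illustrative assumptions, not part of any standard or real QMS schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CapaRecord:
    problem: str           # Specific: what, where, when, how many
    root_cause: str        # Evidence-based, not assumed
    actions: list[str]     # Concrete, numbered actions
    owner: str             # A named person, not a department
    due: date              # Time-bound
    success_criteria: str  # Measurable; defined BEFORE implementation

    # Illustrative blacklist of actions too vague to execute.
    VAGUE_ACTIONS = {"retrain staff", "improve process", "be more careful"}

    def validate(self) -> list[str]:
        """Return a list of completeness issues (empty list = acceptable)."""
        issues = []
        if not self.actions:
            issues.append("No actions defined")
        if any(a.strip().lower() in self.VAGUE_ACTIONS for a in self.actions):
            issues.append("Action too vague to execute")
        if not self.success_criteria:
            issues.append("Success criteria must be defined at creation")
        return issues
```

A record whose only action is "Retrain staff", or whose success criteria are blank, fails validation before it ever enters the workflow.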
Step 4 of 6
Root Cause Analysis
Finding the true cause — not the symptom — is what separates effective CAPAs from failed ones
Why Root Cause Analysis is Non-Negotiable
Skip this step and the problem will return
Studies show that over 70% of CAPA failures occur because teams implemented solutions to symptoms rather than root causes.
Root Cause Analysis (RCA) is the structured investigation to determine the deepest underlying reason a problem occurred — the point where a different action would have prevented it entirely.
✗ Symptom Fix (Fails)
Problem: Machine breaks down repeatedly. Symptom fix: Repair the machine each time. Result: Machine breaks down again in 3 weeks. CAPA closed prematurely, same issue re-opened 4 times in 12 months.
✓ Root Cause Fix (Succeeds)
Problem: Machine breaks down repeatedly. Root cause: Lubrication interval in maintenance schedule is 6 months — but manufacturer requires 3 months under current load conditions. Fix: Update schedule to 3-month intervals. Zero breakdowns in following 12 months.
The 5 Whys Technique
Ask "Why?" repeatedly until you reach a systemic cause you can actually fix. Developed by Sakichi Toyoda, used throughout Toyota Production System.
Problem: Customer received a batch of units with wrong expiry date labels.
Why 1: Operator printed from the wrong label template. → Why?
Why 2: The template selection screen shows 14 templates with similar names. → Why?
Why 3: Templates were added over time with no naming standard. → Why?
Why 4: No template naming procedure exists in the quality system. → Why?
Root Cause (Why 5): Software change control process does not require template naming convention review. This is a system gap.
✓ Countermeasure targets the root: Update software change control to mandate naming convention review. Rename all 14 templates. Add barcode-scan verification before printing.
5 Whys Rules
1. Use real data at each step — each answer must be supported by evidence, not assumption.
2. Involve the people who do the work — they know the actual conditions.
3. Don't stop at "human error" — always ask why the human error was possible.
4. It doesn't have to be exactly 5 — stop when you reach a cause the organisation can actually control.
5. Verify the root cause — confirm that fixing it would have prevented the original problem.
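Rules 1 and 4 can be encoded directly: every "why" must carry evidence, and the chain stops at the first controllable cause rather than at a fixed depth. A minimal sketch, with made-up function and field names:

```python
def five_whys(steps):
    """steps: list of (answer, evidence, controllable) tuples, in order.

    Returns the root cause: the first answer the organisation can
    actually control. Raises if a step lacks evidence (rule 1) or if
    no controllable cause is ever reached (rule 4 says keep asking).
    """
    for answer, evidence, controllable in steps:
        if not evidence:
            raise ValueError(f"Unsupported assumption, not evidence: {answer!r}")
        if controllable:
            return answer
    raise ValueError("No controllable cause reached -- keep asking why")

root = five_whys([
    ("Operator printed from the wrong template", "print log", False),
    ("14 templates with similar names", "template list export", False),
    ("No naming standard in change control", "SOP review", True),
])
# root is the Why-3 answer: the chain stopped at 3, not 5, because
# that cause is already within the organisation's control.
```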
Fishbone (Ishikawa) Diagram for CAPA
The fishbone diagram organises potential causes by category before drilling down with 5 Whys. Use it when the root cause is unclear and multiple departments may be contributing.
Write the problem at the "head." Brainstorm causes under each category bone. Then use voting or data to prioritise the top 1–2 causes for 5 Whys investigation.
People
Training, experience, fatigue, staffing, supervision, communication
Process
Procedures, SOPs, work instructions, approvals, handoffs, controls
Machine
Equipment condition, calibration, maintenance, tooling, capacity
Material
Raw material quality, supplier variation, storage, handling, shelf life
Measurement
Gauge accuracy, inspection method, sampling, data recording
Environment
Temperature, humidity, lighting, contamination, workspace layout
Root Cause Quality Checklist
Is the root cause stated as a system gap, not a person's fault?
Is each "Why" step supported by data or direct observation — not assumption?
Were the people who do the actual work involved in the investigation?
Does fixing this root cause logically prevent the original problem from recurring?
Has a similar analysis been done for similar processes to prevent the same issue elsewhere?
Is the root cause within the organisation's control to fix?
⚠️ The #1 RCA Mistake
"Operator did not follow procedure" is not a root cause — it is a symptom.
The root cause is always: why was it possible for the operator to not follow the procedure without detection?
Ask: Was the procedure unclear? Not available at point of use? Not trained on? No consequence for deviation?
Each of those is a system failure that can be fixed — and will prevent the same error from other operators too.
Case Study: Pharmaceutical Mislabelling
Problem
During a routine audit, 3 batches of finished product (6,000 units) were found with incorrect strength on the label. Batches already shipped to 2 distributors.
5 Whys Investigation (condensed)
Why 1: Label template was updated 6 months ago for a new strength — old template not archived.
Why 2: Both templates remained selectable in the system with similar file names.
Why 3: No archive/retire step exists in the document change control SOP.
Root Cause: Document change control SOP does not require obsolete templates to be retired from the active selection list.
Corrective Action
Immediate recall and relabelling of affected batches. Temporary manual check added to labelling line pending system fix.
Preventive Action
Updated document change control SOP to include mandatory template retirement step. Added system-level control: only templates in "Current" status are selectable. Rolled out to all 7 product lines.
Step 5 of 6
Verification & Closure
Confirming your fix actually worked — and locking in the gain permanently
Why Verification is the Most Critical Step
Most CAPA systems fail here — actions are taken but effectiveness is never confirmed
Verification is the structured process of confirming, using objective evidence, that the corrective and preventive actions:
(1) were actually implemented as planned, and
(2) have effectively resolved the original problem — and that the problem has not recurred.
Without this, you have activity, not improvement.
✗ False Closure
Actions were completed on paper. Trainer signed the training record. Process was "updated."
30 days later: Same defect type appears again. CAPA re-opened. Audit finding raised for ineffective CAPA system.
Root cause of false closure: Verification criteria were never defined before the CAPA was started.
✓ Effective Verification
Before implementation, team agreed: "Zero labelling errors in 100 consecutive units inspected over 60 days."
Day 60: 100% accuracy confirmed via daily log. Control chart shows process stable within limits.
CAPA closed with attached evidence. Audit review: effective. No recurrence in 12 months.
Verification Evidence Requirements
Verification must be objective — it cannot rely on the person who implemented the action saying "it's fixed." Accepted forms of evidence:
Data Evidence
• Control charts showing a stable, improved process
• Defect rate trending down over 60–90 days
• Before/after statistical comparison
• Zero recurrence of the specific defect type
• KPI trend data over the monitoring period
Document Evidence
• Updated SOPs with revision date and approvals
• Training records with signatures and dates
• Completed inspection/audit checklist
• Photographs of physical changes (before/after)
• System screenshots showing configuration change
Audit Evidence
• Independent walkthrough confirming the change is sustained
• Process audit results post-implementation
• Second-party or third-party audit confirmation
• Customer acceptance / no further complaints
• Management review sign-off
⏱️ Time-Based Evidence
• 30 days: Initial check — action in place, early results
• 60 days: Trend confirmation — sustained improvement
• 90 days: Full verification window for most CAPAs
• 6–12 months: Long-term monitoring for Critical CAPAs
• Regulatory CAPAs may require 12-month monitoring
The Closure Process
1. Implementation Review
Confirm all planned actions were completed. Collect implementation evidence (records, photos, updated documents). Ensure nothing was partially completed or substituted without approval. Gate: 100% of planned actions must be evidenced before entering verification period.
2. Monitoring Period (30–90 days)
Collect data against the success criteria defined at CAPA creation. Monitor for recurrence of the original problem. Capture any related issues that emerge. Update the CAPA record with monitoring data at each checkpoint. Do not close early — regulators and auditors specifically check that monitoring periods were completed.
3. Effectiveness Determination
Compare monitoring data against success criteria. Three outcomes are possible: Effective: All criteria met — proceed to closure. Partially Effective: Some criteria met — extend monitoring or add supplemental action. Ineffective: Root cause was wrong or action insufficient — reopen RCA, do not close.
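The three-outcome determination above is a simple decision rule, sketched below for illustration. The key design choice it encodes: recurrence during monitoring overrides everything else, because recurrence proves the root cause analysis was wrong regardless of how many criteria were met.

```python
def determine_effectiveness(criteria_met: int, criteria_total: int,
                            recurred: bool) -> str:
    """Compare monitoring results to the success criteria (illustrative)."""
    if recurred:
        return "Ineffective"            # recurrence overrides: reopen RCA
    if criteria_met == criteria_total:
        return "Effective"              # proceed to closure
    if criteria_met > 0:
        return "Partially Effective"    # extend monitoring or add action
    return "Ineffective"                # nothing met: root cause was wrong

determine_effectiveness(3, 3, recurred=False)   # -> "Effective"
determine_effectiveness(3, 3, recurred=True)    # -> "Ineffective"
```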
4. Lessons Learned & Knowledge Transfer
Before closing, ask: "Does this issue exist, or could exist, in other products, processes, or sites?"
Document lessons learned in a format that can be searched later. Share with relevant teams. Update risk registers if applicable. This step is a regulatory requirement under ISO 9001 Clause 10.2 and FDA 21 CFR Part 820.100.
5. Formal Closure
Management review and sign-off. All required fields completed. Evidence package attached. CAPA status set to "Closed — Effective" in the system. Summary added to management review agenda.
Verification Checklist — Gate Before Closure
All planned actions have been completed and evidenced with records
Monitoring period has been completed (minimum 30 days, 90 days for regulated environments)
Success criteria defined at CAPA creation have been met with objective data
The original problem has not recurred during the monitoring period
Affected SOPs, work instructions, and training materials have been updated
All affected personnel have been trained and training documented
Lessons learned captured and shared with relevant teams
Similar risk assessed for other products, processes, or sites
Management review sign-off obtained
CAPA system record fully completed, evidence attached, status updated
⚠️ Regulatory Requirement: Verification of Effectiveness
FDA 21 CFR Part 820.100(a)(4) and ISO 13485 Clause 8.5.2 both explicitly require that organisations verify the effectiveness of corrective actions — not just that they were implemented.
In FDA inspections, "lack of effectiveness verification" is consistently in the top 5 CAPA deficiencies cited.
Auditors look for: Who defined success criteria? When? What data was collected? Who reviewed it?
If these questions can't be answered from the CAPA record alone, it is an audit finding.
When to Re-open vs. Close a CAPA
Close as Effective: Original problem solved, success criteria met, no recurrence, systemic fix in place.
Close as Partially Effective, Open Follow-Up: Main issue resolved but one action incomplete. Open a new CAPA for the outstanding item rather than keeping the original open indefinitely.
Re-open / Re-investigate: Original problem recurred during monitoring. Root cause was incorrect. Go back to RCA with new data.
Never close under time pressure alone. "The due date has passed" is not a reason to close. It is a reason to escalate.
Step 6 of 6
CAPA in Practice
Managing CAPAs across the organisation — metrics, common pitfalls, and a complete case study
CAPA System Health Metrics
What to measure to know your CAPA system is working
A CAPA system that creates records but never drives real improvement is a liability, not an asset. These metrics reveal system health and are reviewed at management review.
ON-TIME CLOSURE RATE
Formula: CAPAs closed by due date ÷ total CAPAs due. Target: ≥ 85%. Below 85% signals under-resourcing, unrealistic timelines, or lack of management attention.
RECURRENCE RATE
Formula: CAPAs reopened for the same issue ÷ total closed CAPAs. Target: < 10%. High recurrence means root cause analysis is failing. The most important metric.
AVERAGE CYCLE TIME
Targets by priority class: Critical ≤ 30 days | High ≤ 60 days | Medium ≤ 90 days | Low ≤ 120 days. The trend over time shows whether the system is speeding up or slowing down.
OVERDUE CAPA COUNT
Target: 0 Critical/High overdue. Any Critical or High CAPA past its due date requires immediate management escalation. This is an audit finding waiting to happen.
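All four metrics can be computed directly from closed-CAPA records. A minimal sketch, assuming a record carries its opened, due, and closed dates plus a reopened flag; the field and function names are illustrative:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ClosedCapa:
    opened: date
    due: date
    closed: date
    reopened_same_issue: bool = False   # same problem came back after closure

def on_time_closure_rate(capas):
    """Fraction of CAPAs closed on or before their due date."""
    return sum(c.closed <= c.due for c in capas) / len(capas)

def recurrence_rate(capas):
    """Fraction of closed CAPAs reopened for the same issue."""
    return sum(c.reopened_same_issue for c in capas) / len(capas)

def avg_cycle_days(capas):
    """Mean calendar days from opening to closure."""
    return sum((c.closed - c.opened).days for c in capas) / len(capas)
```

Note that on-time closure and recurrence must be read together: a fast closure rate paired with high recurrence means CAPAs are being rushed shut, not solved.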
The Most Common CAPA System Failures
CAPA Factory: Opening CAPAs for everything creates volume that overwhelms the team. Reserve CAPAs for systemic issues — use informal correction for one-off events.
No Management Ownership: CAPAs owned by QA only, not process owners. Sustainable fixes require the person who controls the process to own the CAPA.
Success Criteria Defined After Closure: Criteria must be defined when the CAPA is opened — not after implementation when confirmation bias takes over.
Copy-Paste RCA: Reusing root causes from similar CAPAs without investigating the current incident. Each problem may have a different root even if symptoms look alike.
Treating Verification as a Formality: Collecting evidence without actually analysing it. "Collected 30 data points" ≠ verification. Were they within spec? Did the trend improve?
Lack of Horizontal Deployment: Fixing a problem on Line A without checking Lines B and C. Preventive action should scan all similar processes.
Complete End-to-End CAPA Case Study
Trigger
Customer complaint #CC-0342: Hospital pharmacy reported that 3 infusion pumps from batch 2026-03 were delivered without the mandatory safety checklist documentation. Risk: pumps could be activated without completing pre-use safety verification.
Priority Classification
Critical — patient safety risk, regulatory exposure (FDA 21 CFR Part 820), and a customer-side hold. Escalated to VP Quality within 2 hours of complaint receipt.
Immediate Containment
All 3 units placed on hold at customer site pending documentation supply. Remaining batch 2026-03 inventory (47 units) quarantined pending documentation audit. Customer notified with interim safety instructions.
Root Cause (5 Whys)
Why 1: Safety checklist was not packed with the 3 units.
Why 2: Packing station operator did not receive the checklist kit from Document Control.
Why 3: Document Control's daily batch kit preparation checklist was revised 2 weeks prior — checklist kits were removed from the list during a "streamlining" exercise.
Root Cause: The change was made to the Document Control work instruction without formal change control review. The safety implication of removing checklist kit preparation was not assessed.
Corrective Actions
1. Supply missing checklist documentation to hospital within 48 hours (completed Day 1).
2. Audit all outbound shipments from past 30 days for missing documentation (completed Day 5, 2 additional cases found and remediated).
3. Restore checklist kit preparation to Document Control daily work instruction (completed Day 3).
Preventive Actions
1. Updated change control SOP: any change to packing or documentation procedures requires QA safety impact assessment before implementation.
2. Added outbound packing audit step: 100% packing list verification against customer documentation requirements for first 60 days, then sampled.
3. Horizontal deployment: reviewed Document Control work instructions for all 4 other product lines — identified and corrected 1 similar gap.
Verification (90-day monitoring)
Day 30: 100% packing audit — zero documentation gaps in 312 units shipped.
Day 60: Change control SOP update confirmed in use for 2 subsequent changes — both included QA safety assessment.
Day 90: Control chart for documentation compliance at 100% for 90 consecutive days. CAPA verified effective. Closed.
CAPA in Regulatory Context
ISO 9001:2015 Clause 10.2: Requires corrective action to eliminate causes of nonconformities, proportional to their effects. Results must be documented.
FDA 21 CFR Part 820.100: Requires written procedures for CAPA, identification of existing and potential causes, verification of effectiveness — specific to medical devices.
ISO 13485:2016 Clause 8.5.2: Extends ISO 9001 requirements with additional controls for regulated industries. Requires feedback into risk management.
AS9100 (Aerospace): Requires root cause analysis and determination of whether similar issues exist in other processes or products.
Knowledge Test — 12 Questions
Test Your CAPA Knowledge
Answer all 12 questions to receive your completion certificate. Immediate feedback on every answer.
Question 1 of 12 — CAPA Fundamentals
A machine has been repaired three times in 6 months for the same breakdown. What does this indicate about the CAPA approach being used?
A
The corrective actions have been effective — three repairs proves consistent maintenance
B
The corrective actions are addressing symptoms, not the root cause — the recurrence proves ineffectiveness
C
The machine needs to be replaced, not repaired — CAPA is not the right tool
D
Three repairs in 6 months is within normal tolerance and no CAPA is required
✓ Recurrence of the same problem is the definitive indicator that prior corrective actions addressed symptoms rather than root causes. Each repair "reset" the clock but changed nothing systemic. A proper CAPA would have investigated why the machine breaks down (lubrication, overload, maintenance schedule gap) and addressed that cause — making the breakdown mechanically impossible or predictably prevented.
Question 2 of 12 — CA vs PA
Your competitor has recalled a product due to a software defect. Your product uses similar software architecture. You initiate an investigation proactively. This is an example of:
A
Corrective Action — you are correcting a known problem
B
Preventive Action — you are acting before a problem has occurred in your product
C
Neither — this is a competitive intelligence activity, not a CAPA
D
Corrective Action — because a problem has occurred, even if not in your product
✓ Preventive Action (PA) is triggered by the identification of a potential cause of nonconformity — before a problem actually occurs in your own process. Using a competitor's recall as a signal to proactively assess your own similar system is a textbook PA trigger. This is exactly what the PA mechanism is designed to capture: industry signals, audit findings elsewhere, near-misses, and trend data.
Question 3 of 12 — Prioritisation
An internal audit finds that calibration records for 3 measuring instruments are missing for the past 60 days. Product shipped during this period cannot be recalled as it has been consumed. What priority should this CAPA receive?
A
Low — the product has already been consumed so there is no ongoing risk
B
Medium — calibration records are administrative, not a safety issue
C
Critical or High — this is a regulatory compliance violation (ISO 9001, 21 CFR) with potential product quality impact and audit risk
D
No CAPA needed — document the finding and move on
✓ Missing calibration records is a regulatory violation under ISO 9001 Clause 7.1.5 and FDA 21 CFR Part 820.72. It raises questions about product quality released during that period, is an automatic audit finding, and must be treated with urgency. Even if the immediate product risk is low, the systemic failure (no controls to prevent calibration lapse) must be addressed before the next audit — and potentially before the next shipment.
Question 4 of 12 — CAPA Workflow
A team has completed root cause analysis, developed an action plan, and begun implementation. A senior manager asks them to close the CAPA now as the team is busy. What should the CAPA coordinator do?
A
Close the CAPA as requested — management authority supersedes the process
B
Open a new CAPA for the verification step
C
Decline to close — explain that closing before verification is a regulatory non-compliance that creates greater audit risk than the delay
D
Close but add a note that verification is pending
✓ Closing a CAPA before the verification period is complete is a regulatory non-conformance under ISO 9001, ISO 13485, and FDA 21 CFR Part 820. The risk of an audit finding for "premature closure" and "lack of effectiveness verification" is significantly higher than the risk of having an open CAPA. The coordinator should escalate the manager's request, explain the regulatory position, and offer to review whether resources can be better allocated rather than cutting the verification step.
Question 5 of 12 — Root Cause Analysis
After applying 5 Whys, the team concludes: "Root cause: Operator did not follow the labelling procedure." What is wrong with this root cause statement?
A
Nothing — human error is a valid root cause and operator retraining is the correct fix
B
It identifies a symptom, not a root cause — it doesn't explain why the deviation was possible and went undetected
C
It's too specific — root cause should be broader to cover all possible scenarios
D
5 Whys is the wrong tool — a fishbone diagram should have been used instead
✓ "Operator did not follow procedure" is a behaviour description, not a root cause. It stops at the human level without asking: Was the procedure clear? Available at the point of use? Last updated? Was compliance monitored? Was there a detection mechanism? The root cause is always the systemic condition that allowed the deviation to happen and go undetected — not the fact that a person made a mistake.
Question 6 of 12 — Writing Effective CAPAs
Which of the following is the best-written corrective action?
A
"Improve the labelling process to prevent future errors."
B
"Retrain all operators on labelling procedures by end of quarter."
C
"By 15 April 2026, J. Smith (QA) will: 1) Rename all 14 label templates to PRODUCT-STRENGTH-DATE format; 2) Archive obsolete templates; 3) Add barcode scan verification step to labelling SOP Rev 4; 4) Train 6 operators with competency sign-off. Zero labelling errors in 90-day post-implementation audit = verified effective."
D
"Investigate the labelling issue and implement appropriate corrective actions."
✓ Option C is the only SMART corrective action: Specific (exact actions listed), Measurable (90-day audit with zero errors), Achievable (defined scope), Relevant (directly addresses root cause), Time-bound (15 April 2026). Options A and D are meaningless — no one knows what to do. Option B is incomplete — training alone rarely addresses systemic issues, and "end of quarter" is vague. Verification criteria must be defined upfront.
Question 7 of 12 — Verification
After 90 days, a team reviews their CAPA and finds that training was completed and the SOP was updated, but the defect they were targeting has occurred twice more during the monitoring period. What should happen?
A
Close the CAPA as the actions were completed — the recurrences are a separate issue
B
Extend the monitoring period by another 90 days to collect more data
C
Mark as Ineffective — reopen root cause analysis with the new recurrence data as the root cause was wrong or actions were insufficient
D
Close as partially effective and open a new CAPA for the two recurrences
✓ Recurrence during the monitoring period definitively indicates the CAPA was ineffective. The root cause analysis either identified the wrong cause, or the actions were insufficient to address it. The correct response is to mark the CAPA Ineffective, return to the Investigate step with the new data (which is now more informative), and conduct a deeper root cause investigation. Extending monitoring without new action is just collecting more evidence of the same failure.
Question 8 of 12 — 5 Whys
During a 5 Whys investigation, the team reaches Why 3 and identifies a clear systemic fix. Should they continue to Why 4 and Why 5?
A
Yes — you must always ask exactly 5 Whys, no more, no less
B
Only continue if the Why 3 cause is itself caused by something the organisation can control. Stop at the deepest cause that can be addressed.
C
No — stop at Why 3, as more Whys introduce speculation
D
Yes — always go to Why 7 or Why 8 for serious CAPAs
✓ The "5" in 5 Whys is a guideline, not a rule. Stop when you reach a cause that: (a) the organisation can actually control and fix, and (b) fixing it would prevent the original problem from recurring. If Why 3 meets both criteria, stop there. If Why 3 is itself caused by something structural that can be improved, continue deeper. Going deeper than necessary into uncontrollable systemic causes wastes time and produces untestable countermeasures.
Question 9 of 12 — Verification Evidence
Which of the following is the strongest form of verification evidence for a CAPA that reduced a welding defect rate?
A
Signed training records confirming all welders completed the updated procedure training
B
The weld supervisor's written statement that "the process has improved noticeably"
C
Updated SOP showing the new welding parameters
D
90-day control chart showing defect rate fell from 8.2% to 0.4% with stable process performance since implementation
✓ Only option D demonstrates that the problem actually improved. Training records prove training happened — not that it worked. A supervisor's statement is subjective. An updated SOP proves the document changed. But a control chart with 90 days of objective data showing the defect rate dropped to near-zero and stayed there is direct, quantitative evidence that the corrective action was effective. Always measure outcome, not activity.
Question 10 of 12 — CAPA Metrics
A company's CAPA system shows: 94% on-time closure rate, average cycle time of 28 days, but 35% recurrence rate. What does this tell you about the CAPA system?
A
The system is excellent — 94% on-time is world-class performance
B
The system is good but could improve on cycle time
C
The system is fundamentally broken — 35% recurrence means CAPAs are being closed quickly without solving underlying problems
D
The 35% recurrence is acceptable — some problems are too complex to solve permanently
✓ A 35% recurrence rate is catastrophic regardless of how fast CAPAs are closed. It means 1 in 3 "solved" problems comes back — the organisation is doing rework on its own improvement system. A 94% on-time closure rate with 35% recurrence means CAPAs are being rushed to closure without adequate root cause analysis or verification. Speed of closure means nothing without effectiveness. Recurrence rate is the single most important CAPA metric.
Question 11 of 12 — Regulatory Requirements
Under FDA 21 CFR Part 820 and ISO 13485, what is the specific regulatory requirement that most CAPA systems fail to adequately demonstrate during inspections?
A
That all CAPAs were completed within the target timeframe
B
That CAPAs were reviewed at management review meetings
C
Verification of effectiveness — that objective evidence confirms the actions actually solved the problem
D
That root cause analysis used a recognised tool such as 5 Whys or fishbone
✓ "Lack of effectiveness verification" is consistently among the top 5 FDA CAPA deficiencies cited in 483 observations and Warning Letters. The regulation explicitly requires that corrective actions are verified for effectiveness — not just documented as complete. Auditors look for: pre-defined success criteria, objective data collected during monitoring, and a documented determination of whether criteria were met. If these cannot be shown, it is a finding.
Question 12 of 12 — Preventive Deployment
A CAPA has been successfully closed for a labelling error on Product Line A. What final action should be taken before archiving?
A
Archive immediately — the CAPA is closed and no further action is needed
B
Notify the customer that the CAPA has been closed successfully
C
Assess whether the same root cause exists on Product Lines B, C, and D — and implement preventive actions there before a similar problem occurs
D
Publish the lessons learned document and then archive
✓ Horizontal deployment — checking whether the same root cause exists in similar processes — is a regulatory requirement under ISO 9001 Clause 10.2 and AS9100, and a quality system best practice. Fixing Line A while Lines B, C, and D remain vulnerable is an incomplete CAPA. The standard explicitly asks: "Are there similar nonconformities, or could there be?" Lessons learned documents alone do not satisfy this — specific assessment and action on similar processes is required.
Awarded for successfully completing the CAPA Mastery Guide, demonstrating comprehensive knowledge of Corrective and Preventive Action management: CAPA workflow, prioritisation, creating effective CAPAs, root cause analysis, verification of effectiveness, and regulatory requirements.