ISO 15189:2022 Proficiency‑Testing & External Quality Assessment


Last Updated on October 17, 2025 by Melissa Lazaro

Why Proficiency Testing Is the True Measure of Laboratory Confidence

Here’s something I’ve seen time and time again in accredited laboratories:
Everyone focuses on documents, equipment, and internal audits — but the real test of competence happens outside your lab.
That’s what Proficiency Testing (PT) and External Quality Assessment (EQA) are all about.

ISO 15189:2022 doesn’t just want your lab to follow procedures; it wants to know that your results hold up when compared with others doing the same tests.
That’s the ultimate proof of accuracy and reliability — because if your lab’s data stays consistent under independent evaluation, you’ve earned real credibility.

And let’s be honest — assessors pay close attention to this part.
They know that PT/EQA participation is where theory meets performance.
It reveals whether your quality management system works in practice, not just on paper.

In this guide, I’ll walk you through how to build a solid PT/EQA system under ISO 15189:2022 — from selecting the right provider, handling samples correctly, analyzing results, and dealing with unsatisfactory scores, all the way to integrating your EQA data into continuous improvement.

By the end, you’ll know exactly how to turn PT and EQA participation into one of your lab’s strongest sources of audit evidence.

Understanding ISO 15189:2022 Requirements for Proficiency Testing (Clause 7.3)

Before diving into the “how,” let’s get clear on what ISO 15189 actually requires when it comes to proficiency testing and external quality assessment.
Because this isn’t just about joining a program and ticking a box — it’s about proving, with evidence, that your lab consistently produces accurate, comparable, and reliable results.

What Proficiency Testing and EQA Really Mean

Under Clause 7.3, ISO 15189 defines proficiency testing (PT) as an external, independent evaluation of your lab’s performance.
It involves analyzing unknown samples provided by a recognized PT or EQA provider and comparing your results against other participating laboratories.

The purpose is simple: to confirm that your lab can deliver valid results — not just once, but every time.

EQA (External Quality Assessment) is the broader system that includes proficiency testing and sometimes inter-laboratory comparisons or external reviews of performance.

In short:
PT = the actual testing event.
EQA = the overall framework ensuring external verification of your competence.

What ISO 15189 Requires You to Do

To comply with Clause 7.3, your laboratory must:

  1. Participate in relevant PT/EQA programs for each type of examination when such programs exist.

  2. Follow documented policies defining participation frequency, performance criteria, and corrective-action requirements.

  3. Treat PT samples as routine samples — no special preparation, no shortcuts, no re-runs unless you’d normally repeat a patient test.

  4. Analyze, evaluate, and document PT results — especially if they’re unsatisfactory.

  5. Take corrective and preventive actions and verify their effectiveness.

Pro Tip:
Map each analytical method in your lab to at least one PT or EQA program.
That “PT Participation Matrix” is something every assessor will ask to see.
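A matrix like that can start as something very simple. Here is a minimal Python sketch (the method and program names are purely illustrative) that flags any method left without PT/EQA coverage:

```python
# Hypothetical PT Participation Matrix: analytical method -> covering programs.
# An empty list means no program exists and a documented alternative is needed.
matrix = {
    "Glucose (hexokinase)":   ["RIQAS Clinical Chemistry"],
    "Creatinine (enzymatic)": ["RIQAS Clinical Chemistry", "UK NEQAS"],
    "Rare allergen panel":    [],
}

uncovered = [method for method, programs in matrix.items() if not programs]
print("Methods needing a documented alternative:", uncovered)
```

Even this tiny check, run whenever the matrix is updated, keeps the "no coverage" cases visible instead of buried in a spreadsheet.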

Why It Matters

Assessors use PT/EQA performance to judge your lab’s technical reliability.
Even one round of poor PT results, if mishandled, can raise concerns about overall competence.
But if you handle them transparently, analyze root causes, and show improvements in the next round, you’ll earn serious credibility.

Common Mistake:
Labs often submit PT results without showing what they did with the feedback.
ISO 15189 wants you to close the loop — demonstrate action, not just participation.

The takeaway?
Clause 7.3 isn’t just a requirement; it’s your lab’s report card.
And when you manage it well, it tells assessors something powerful: your results don’t just meet the standard — they stand up to the world’s scrutiny.


Building Your Laboratory’s Proficiency-Testing Program

In my experience, the strongest labs don’t treat proficiency testing as a yearly obligation — they treat it like a core performance metric.
When your PT/EQA program is well-designed, it becomes your early-warning system. It tells you where variation creeps in long before patients or clients ever notice.

Here’s how to build a program that not only meets ISO 15189:2022 Clause 7.3 but also works for you year-round.

1. Choose the Right PT/EQA Providers

Start with recognition and credibility.
Only use providers that are accredited or recognized by signatories to the ILAC Mutual Recognition Arrangement or by your national accreditation authority.
If a program isn’t recognized, you’ll spend more time justifying its validity than benefiting from it.

Checklist when evaluating a provider:

  • Are they accredited to ISO/IEC 17043?

  • Do they offer the analytes or test types you perform?

  • Are reports clear about performance statistics (z-score, SDI, % bias)?

  • Is their schedule compatible with your workflow?

Pro Tip:
Create a “PT Provider Register.” Note the provider’s name, accreditation status, test scope, and contact details.
Auditors love seeing that you know exactly who evaluates you externally.

2. Plan Participation Frequency and Scope

Every test you perform should be linked to at least one proficiency program if available.
For high-volume or high-risk tests (e.g., glucose, creatinine, or hemoglobin), quarterly or twice-yearly participation demonstrates control.
For specialized tests where programs are rare, annual participation or inter-lab comparisons may suffice — as long as it’s justified and documented.

Common Mistake:
Joining one PT event a year for only a few analytes and assuming you’re covered. Assessors expect coverage for all major examinations.

3. Assign Roles and Responsibilities

Define who does what.
Here’s a simple model:

  • Section Head: Oversees PT sample handling and testing.

  • QC Coordinator: Tracks the schedule and ensures timely submission.

  • Quality Manager: Reviews reports and initiates CAPA when needed.

Having this table in your Quality Manual or procedure saves you from confusion when assessors ask, “Who reviews EQA results?”

4. Document Every Step

For each PT event, maintain a short but complete record including:

  • Provider name and accreditation number

  • Analyte(s) tested

  • Date samples received, tested, and results submitted

  • Analyst’s name

  • Evaluation summary and CAPA reference (if applicable)

Pro Tip:
Use a digital or Excel-based “PT Tracking Sheet.” It keeps everything in one place — easy for your team to update, easy for assessors to verify.
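If you prefer a script to a spreadsheet, the same tracking sheet can be kept as a plain CSV. A minimal sketch (the column names and sample event below are illustrative, not a prescribed format):

```python
import csv
import io

# Columns mirroring the record items listed above (names are illustrative)
FIELDS = ["provider", "accreditation_no", "analyte", "received",
          "tested", "submitted", "analyst", "evaluation", "capa_ref"]

def add_event(rows, **event):
    """Append one PT event; any unspecified field stays blank."""
    rows.append({field: event.get(field, "") for field in FIELDS})

rows = []
add_event(rows, provider="EQAS Provider X", accreditation_no="17043-001",
          analyte="Glucose", received="2025-06-03", tested="2025-06-04",
          submitted="2025-06-05", analyst="M. Reyes", evaluation="Satisfactory")

buffer = io.StringIO()  # stand-in for a real file on disk
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

One row per PT event, one file per year, and the whole participation history is searchable and easy to hand to an assessor.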

5. Build Your PT Calendar

Integrate your PT schedule into your annual quality plan.
Include upcoming rounds, expected result dates, and CAPA review meetings.
When PT deadlines are part of your QMS calendar, you never scramble last-minute again.

A structured PT program doesn’t just satisfy ISO 15189 — it builds confidence.
It shows assessors that your lab controls its accuracy, not just claims it.

Handling and Testing Proficiency Samples – Best Practices for Compliance

Here’s something most labs underestimate — proficiency testing isn’t just about submitting the right numbers.
Assessors look closely at how you handled those samples.
Because your testing behavior during a PT round reveals whether you treat external samples with the same discipline as real patient specimens.

The rule is simple but crucial:
PT samples must be handled exactly like routine samples — no exceptions.

1. Treat PT Samples Like Real Patient Specimens

Once PT samples arrive, they should enter your normal workflow — not a “special” one.
That means:

  • Register them into your LIS or logbook using your standard sample ID format.

  • Process them following the same SOP used for patient testing.

  • Apply the same QC checks, calibration routines, and reporting methods.

Pro Tip:
Never test PT samples outside normal working hours or with additional QC runs.
That signals “special treatment” and raises red flags for assessors.

2. Maintain Full Traceability

ISO 15189 assessors often ask: “Who handled this PT sample and when?”
Be ready to show that record instantly.

Document:

  • Sample receipt date and condition.

  • The person responsible for analysis.

  • Equipment ID and method used.

  • Time and date of testing.

Example:

PT Sample ID: EQAS-2025-02
Received: 3 June 2025, by M. Reyes
Tested: 4 June 2025 on Cobas c311, Method: Enzymatic
Reported: 5 June 2025 via EQA Portal

That level of traceability demonstrates control — and assessors love seeing it.

3. Submit Results Objectively and On Time

Late or adjusted submissions are one of the most common mistakes.
Once results are generated, report them exactly as obtained — no editing, no averaging, no re-runs unless your SOP explicitly allows repeat testing for routine work.

Common Mistake:
Comparing your PT results with other labs before submission.
That violates the purpose of proficiency testing and can be flagged as unethical behavior.

Pro Tip:
Keep a copy (screenshot or printout) of the exact results you submitted.
If a provider’s system ever flags an inconsistency, you’ll have proof.

4. Store PT Samples and Data Properly

Unless the provider specifies otherwise, keep PT samples (if stable) until results are released.
This allows re-analysis if discrepancies appear.
Also, store raw data, instrument printouts, and worksheets with your PT record for at least one audit cycle.

5. Conduct a Post-Testing Review

Once results are submitted, hold a quick review session.
Ask:

  • Were samples received and logged correctly?

  • Were testing conditions normal?

  • Were there any instrument or reagent issues?

That short discussion helps you spot potential problems before you even receive the evaluation report.

Pro Tip:
Add a “PT Sample Handling Log” in your EQA binder or digital tracker.
It keeps all events — from receipt to result submission — in one neat, auditable record.

When you treat PT samples with the same rigor as patient samples, you’re not just meeting ISO 15189 — you’re proving integrity.
Because consistency under observation is the purest form of competence.

Evaluating PT/EQA Results and Taking Corrective Actions

Here’s where proficiency testing moves from compliance to real quality improvement.
Submitting your results is only half the job — what matters most is how you interpret and act on the feedback.
ISO 15189:2022 expects your lab to demonstrate that you evaluate every PT report, identify causes of poor performance, and take measurable corrective actions.

Let’s walk through what that process looks like in practice.

1. Review the EQA Report Carefully

When you receive your EQA performance report, don’t just file it — dissect it.
Review parameters like:

  • z-score or SDI (Standard Deviation Index)

  • Bias (%)

  • Peer group mean and assigned value

  • Performance status (satisfactory / unsatisfactory)

Pro Tip:
Always print or download the provider’s summary and highlight your lab’s results versus target.
This makes trend reviews easier during internal audits or management review.

2. Identify Unsatisfactory or Questionable Results

If your score falls outside the acceptable range (commonly, |z| or SDI above 2 is a warning signal and beyond 3 is unsatisfactory), treat it as a non-conformity.
Don’t wait for the assessor to ask about it — log it immediately in your CAPA system.

Common Causes:

  • Instrument calibration drift

  • Expired or unstable reagents

  • Analyst transcription errors

  • Incorrect unit conversions

  • Sample mix-ups

Pro Tip:
Look beyond the outlier.
If your z-score is trending toward the limit over several rounds, it’s an early warning sign — even if technically still “satisfactory.”
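Both checks, the hard limit and the trend, are easy to automate. Here is a minimal sketch, assuming the common cut-offs of 2 (warning) and 3 (unsatisfactory) and a simple "three rounds on the same side" drift rule; neither convention is mandated by the standard:

```python
def z_score(result, assigned_value, sd):
    """z = (lab result - assigned value) / standard deviation of the scheme."""
    return (result - assigned_value) / sd

def classify_z(z):
    """Common convention: |z| < 2 satisfactory, 2-3 questionable, >= 3 unsatisfactory."""
    magnitude = abs(z)
    if magnitude < 2:
        return "satisfactory"
    if magnitude < 3:
        return "questionable"
    return "unsatisfactory"

def trend_warning(z_history, window=3, limit=1.5):
    """Flag drift: the last `window` rounds all on the same side of zero
    and all beyond `limit`, even if each round is individually satisfactory."""
    recent = z_history[-window:]
    if len(recent) < window:
        return False
    same_side = all(z > 0 for z in recent) or all(z < 0 for z in recent)
    return same_side and all(abs(z) >= limit for z in recent)

# Hypothetical glucose z-scores over five EQA rounds
history = [0.4, 1.6, 1.7, 1.8, 1.9]
print(classify_z(history[-1]))  # satisfactory on paper...
print(trend_warning(history))   # ...but the drift check catches the trend
```

A check like this turns "look beyond the outlier" from advice into a routine step in every EQA review.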

3. Conduct Root-Cause Analysis

This is where many labs go wrong — they fix the number but not the system.
Treat poor EQA performance just like an internal audit finding.
Use structured RCA tools like:

  • 5 Whys (simple problems)

  • Fishbone Diagram (complex issues)

  • Process walkthroughs (hands-on observation)

Example:

Problem: Low EQA result for serum glucose.
Why? → Analyzer calibration offset.
Why? → New lot of calibrator not verified before use.
Root Cause: Lack of verification step for new reagent lots.

Now you know what to fix — not just the symptom, but the cause.

4. Implement Corrective Actions

Once the root cause is confirmed, document and execute your plan.
This may involve:

  • Re-calibration and verification of affected analyzers

  • Updating SOPs or adding lot-verification steps

  • Retraining analysts on sample handling or reporting procedures

  • Communicating lessons learned across all sections

Pro Tip:
Attach evidence to your CAPA record — calibration certificates, updated SOPs, and training attendance sheets.
Assessors often ask, “Show me proof this action was completed.”

5. Verify Effectiveness

A corrective action isn’t closed until you verify it worked.
That could mean:

  • Reviewing daily QC trends for stability

  • Monitoring the next PT/EQA round for improvement

  • Re-auditing the process after implementation

If the issue doesn’t recur, record your verification and close the CAPA with confidence.

6. Discuss and Document in Management Review

Summarize EQA findings in your next management review meeting:

  • Number of satisfactory and unsatisfactory results

  • Root causes identified

  • Actions taken and verified

  • Any trends or process improvements

Example:

“Two unsatisfactory PT results for creatinine led to improved lot-verification procedures.
Follow-up testing showed no recurrence in the next EQA cycle.”

That’s exactly the kind of evidence auditors expect.

When your lab reviews EQA results systematically and acts decisively, you prove something bigger than compliance — you prove competence.
And that’s what turns ISO 15189 into a living, breathing quality system instead of a binder on the shelf.

When No Proficiency Program Is Available – Alternatives and Justification

Here’s the reality — not every laboratory test has a suitable proficiency testing (PT) or external quality assessment (EQA) program available.
Specialized or low-volume tests often fall outside commercial schemes.
But ISO 15189:2022 doesn’t let you skip evaluation entirely — it expects you to find equivalent ways to prove ongoing competence.

This is where smart labs stand out. They don’t wait for programs to exist; they build structured, defensible alternatives.

1. What ISO 15189 Allows

Clause 7.3 is clear:

“Where proficiency testing or interlaboratory comparison is not available, the laboratory shall adopt alternative approaches to demonstrate the validity of examination results.”

That means you must still show that your testing process is accurate, reproducible, and traceable — even without an official EQA provider.

Pro Tip:
Always document why no program exists and what alternative method you’ve implemented.
Assessors expect a clear, written justification.

2. Acceptable Alternatives When PT/EQA Is Not Available

Here are recognized and widely accepted methods under ISO 15189:

a. Interlaboratory Comparison

Partner with another accredited lab performing the same test.
Exchange blinded samples and compare results.

  • Use at least 5–10 samples across the test’s measuring range.

  • Calculate bias and percentage difference.

  • Investigate and document any discrepancies.

Pro Tip:
Use this annually for rare or complex assays like drug monitoring or molecular tests.
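The bias and percentage-difference arithmetic is simple enough to script. A minimal sketch, with hypothetical paired creatinine results (values and units are illustrative):

```python
def paired_comparison(ours, partner):
    """Mean bias (our lab minus partner) and per-sample % difference
    relative to the partner's result, for paired blinded samples."""
    differences = [a - b for a, b in zip(ours, partner)]
    pct_diff = [100 * (a - b) / b for a, b in zip(ours, partner)]
    mean_bias = sum(differences) / len(differences)
    return mean_bias, pct_diff

# Hypothetical creatinine results (umol/L) across the measuring range
ours    = [62.0, 88.5, 121.0, 250.0, 410.0]
partner = [60.0, 90.0, 118.0, 255.0, 400.0]

bias, pct_diff = paired_comparison(ours, partner)
print(f"Mean bias: {bias:+.1f} umol/L")
print(f"Largest % difference: {max(abs(p) for p in pct_diff):.1f}%")
```

Record the computed bias alongside your pre-defined acceptance limit, and the comparison becomes objective evidence rather than a qualitative "results agreed".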

b. Split-Sample Testing

If you serve a client or hospital with overlapping testing capacity, share actual patient samples for verification.
Compare both results and note analytical differences.

Example:
A regional immunology lab verifies its rare allergen assay by sending parallel samples to a national reference center once per quarter.

c. Use of Certified Reference Materials (CRMs)

When available, analyze certified reference materials and compare your measured values to the certified reference value.
Document calibration data, uncertainty, and any observed bias.

This works especially well for quantitative chemistry and trace analysis.

d. Retesting or Repeat Testing

For tests with stable samples (e.g., hematology or serology), retest retained specimens under different conditions, instruments, or operators.
If consistency is maintained, you’ve demonstrated reproducibility.

3. Documenting Your Justification

Assessors will ask, “How do you know your method remains valid without EQA?”
Here’s what to prepare:

  • A “PT Unavailability Register” listing the test, method, and reason for no PT.

  • Evidence of the alternative approach used.

  • Evaluation records showing comparable or consistent results.

  • Review frequency (usually annual).

Pro Tip:
Get written approval of this approach from your Quality Manager or Technical Director.
It shows the decision was reviewed and authorized — not improvised.

4. Incorporate Results into Your CAPA and Review System

Even alternative comparisons should feed back into your continual improvement cycle:

  • Record findings and actions in your CAPA tracker.

  • Discuss performance trends during management review.

  • Reassess the market annually to check if a new PT/EQA provider becomes available.

Common Mistake:
Using an alternative once and never redoing it.
ISO 15189 expects consistency — so repeat verification annually or when methods change.

When you document thoughtful, repeatable alternatives, assessors see a lab that takes competence seriously — even when no external scheme exists.
That level of control turns a potential weakness into proof of professionalism.

Integrating EQA Performance into Your Quality Management System

Proficiency testing and EQA results don’t belong in a drawer or a binder — they belong at the heart of your quality system.
ISO 15189:2022 expects you to use EQA outcomes not just for compliance, but as evidence of continual improvement.
In other words, your EQA data should tell a story about how your lab learns, adapts, and grows more reliable over time.

Here’s how to make that happen.

1. Trend and Analyze Your EQA Results

Don’t treat each EQA report as an isolated event.
Pull your results into a simple trend chart — by analyte, instrument, or department — and track performance over time.

Look for patterns:

  • Consistent positive or negative bias?

  • Certain analytes always near the limit of acceptability?

  • Gradual performance drift over multiple cycles?

Pro Tip:
Color-code trends (green for satisfactory, yellow for borderline, red for unsatisfactory).
A visual dashboard instantly tells assessors that you don’t just collect data — you use it.
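The color-coding rule itself is a one-liner. A minimal sketch of the dashboard logic (the analytes and scores are made up, and the 2/3 cut-offs are a common convention rather than a requirement):

```python
def color_code(z):
    """Traffic-light status for a z-score: green < 2, yellow < 3, red >= 3."""
    magnitude = abs(z)
    if magnitude < 2:
        return "green"
    if magnitude < 3:
        return "yellow"
    return "red"

# Hypothetical z-scores by analyte across four EQA rounds
rounds = {
    "glucose":    [0.5, 1.1, 2.2, 1.8],
    "creatinine": [3.4, 2.6, 1.9, 0.8],
}

dashboard = {analyte: [color_code(z) for z in zs] for analyte, zs in rounds.items()}
for analyte, colors in dashboard.items():
    print(f"{analyte:<12} {' '.join(colors)}")
```

Read across a row and the story is obvious at a glance: the hypothetical creatinine line moving from red to green is exactly the kind of improvement trend assessors look for.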

2. Link EQA Data to Staff Competence

EQA isn’t only about instruments — it reflects how well your team executes daily testing.
If a poor EQA result ties back to a specific section or shift, review competency records.
Ask questions like:

  • Was the analyst trained on the latest procedure?

  • Did they handle calibrations or reagent lots properly?

  • Is refresher training needed?

Example:
A lab repeatedly saw high bias in potassium results. Root cause? One staff member skipped the pre-run QC verification.
They retrained everyone, verified performance, and saw consistent improvement in the next EQA cycle.

Pro Tip:
Document this connection — it’s powerful evidence of staff accountability and continuous learning.

3. Feed EQA Findings into Risk and CAPA Systems

Every unsatisfactory or borderline EQA result should flow directly into your CAPA and risk management system.

Here’s the flow:

  1. EQA result reviewed

  2. Issue logged in CAPA system

  3. Root cause analyzed

  4. Corrective action implemented and verified

  5. Risk register updated to reflect reduced likelihood.

This closed-loop process shows assessors that you’re not just reactive — you’re actively managing performance risks.

4. Integrate EQA Trends into Management Review

Your annual management review is the perfect time to showcase your EQA performance.
Summarize:

  • Number of EQA events completed

  • Success rate (% satisfactory)

  • Non-conformities and corrective actions

  • Improvements made since the previous review

Example:

“EQA bias in total cholesterol reduced from +7% to +2% following reagent lot verification implementation.”

That single line tells assessors your quality system is dynamic, not static.

Pro Tip:
Use graphs and summaries — not raw data — in management review presentations.
It helps leadership focus on improvement, not statistics.

5. Communicate EQA Lessons Across the Team

After each EQA round, share outcomes with your team — both successes and issues.
It builds awareness and prevents silos.

Hold short monthly “Quality Huddles” where you:

  • Highlight good performance.

  • Discuss what went wrong and how it was fixed.

  • Encourage staff to ask questions or suggest improvements.

When everyone sees EQA as part of their daily work, quality becomes part of the culture, not just an annual event.

6. Keep Records Organized and Ready

Create a dedicated EQA Performance Folder or Dashboard that includes:

  • All PT/EQA provider reports

  • CAPA records linked to poor results

  • Trend summaries

  • Management review notes

Having this ready during an audit makes a strong impression.
It tells assessors: “We don’t just store data — we manage it.”

When your EQA process feeds your training, CAPA, and management systems, it stops being a compliance exercise.
It becomes your lab’s performance compass — showing you where you’re strong, where you’re improving, and where to focus next.

That’s what ISO 15189 is really about: continuous confidence in every result you report.

FAQs – ISO 15189 Proficiency Testing & EQA

Over the years, I’ve heard the same questions from lab managers and quality officers right after an audit.
Most aren’t confused about what proficiency testing is — they’re unsure about how much is enough and what assessors really expect.
So here are straightforward answers, grounded in what actually happens during ISO 15189 assessments.

Q1. How many PT or EQA rounds do we need per year?

It depends on the analyte and availability of programs.
For most common tests, two to four rounds per year are ideal.
If you’re in a smaller or specialized lab where EQA isn’t offered frequently, once per year may be acceptable — as long as you justify it in your documented plan.

Pro Tip:
Create an annual EQA participation schedule.
List every test, its provider, and how many rounds you’ll complete.
Assessors love seeing a proactive plan instead of reactive participation.

Q2. Can we use a foreign EQA provider if no local program exists?

Yes, absolutely — ISO 15189 allows it.
You just need to confirm that the provider is accredited to ISO/IEC 17043 (or recognized by an equivalent national authority).
If you use a foreign provider, document your rationale and ensure you understand their evaluation criteria and reporting format.

Example:
A Philippine diagnostic lab used UK NEQAS and RIQAS for clinical chemistry because no local schemes covered all their analytes.
They documented the justification and maintained recognition certificates — zero issues during accreditation.

Q3. What should we do if we get an unsatisfactory EQA result?

Treat it exactly like a non-conformity.
Investigate, perform root-cause analysis, take corrective action, and verify the result in the next EQA round.

Here’s a quick process you can follow:

  1. Record the finding in your CAPA tracker.

  2. Investigate possible sources — instrument, reagent, method, or operator.

  3. Correct the issue and document proof.

  4. Verify through follow-up analysis or QC trend review.

Pro Tip:
Never ignore or hide poor results — assessors want to see how you responded, not perfection.

Q4. Do assessors check our raw PT/EQA data or only the summary reports?

Both.
They’ll review the provider’s performance summary and ask to see your internal traceability records — the raw data, worksheets, and instrument printouts.
This shows whether your testing process and reporting align with your normal workflow.

Common Mistake:
Labs discard PT worksheets after result submission.
Keep everything — it’s part of your objective evidence of control.

Q5. What if no EQA program exists for some of our specialized tests?

That’s fine, as long as you can show an equivalent alternative (split-sample testing, inter-laboratory comparison, or use of certified reference materials).
You must also record:

  • The reason no EQA program exists.

  • The method you used to verify competence.

  • The date and results of your verification.

Pro Tip:
Update this justification annually — assessors will check that it’s still current and relevant.

Q6. Can poor EQA performance affect our accreditation status?

It can — but only if it’s mishandled.
One poor result won’t harm your accreditation, but failure to investigate, act, or document it can.
Accreditation bodies want to see control and improvement, not perfection.

When you handle EQA the right way — objectively, transparently, and systematically — it becomes one of your best assets during an audit.
It shows assessors your lab doesn’t just chase compliance; it proves competence.

Turn EQA Data into Continuous Improvement

At its core, proficiency testing and EQA aren’t just about passing or failing.
They’re about building trust — in your data, in your staff, and in your system.
ISO 15189:2022 treats EQA as the external proof that your lab delivers consistent, defensible results every single time.

When you embrace that mindset, every EQA round becomes less about compliance and more about confidence.

What Sets High-Performing Labs Apart

In my experience, the best labs do three things differently:

  1. They investigate, not justify.
    When results look off, they don’t argue with the numbers — they dig deeper until they understand why.

  2. They integrate, not isolate.
    EQA data feeds their internal audits, CAPA tracker, risk register, and training program.
    It’s all connected — nothing sits in a silo.

  3. They improve, not defend.
    Every PT finding, good or bad, becomes a learning opportunity.
    They turn feedback into a plan, not an excuse.

That’s how you transform EQA from a once-a-year requirement into a year-round quality engine.

Key Takeaways

  • EQA performance is one of the strongest indicators of laboratory reliability.

  • Each result — even the poor ones — offers a chance to improve your methods and training.

  • Trend analysis, CAPA linkage, and management review close the loop on continual improvement.

  • Transparency and documentation are what assessors value most — not perfection.

Your Next Step

If you’re ready to organize your EQA program efficiently, start with tools built by professionals who live and breathe ISO 15189.

[Download QSE Academy’s ISO 15189 Proficiency Testing & EQA Tracker Template]
It helps you record participation, link CAPAs, track trends, and prepare evidence for your next accreditation audit — all in one streamlined system.

A lab that learns from its EQA results is a lab that earns respect — from assessors, from clients, and most importantly, from itself.
Because true quality isn’t proven once a year during an audit.
It’s proven every day in every result you report.
