What Is Proficiency Testing?

Proficiency testing (also known as comparative testing) is an important means of meeting the quality assurance requirements of ISO/IEC 17025 for laboratory results. Accreditation bodies also mandate that laboratories participate in proficiency testing programs, where suitable programs exist, for all types of analyses undertaken in the laboratory.

Proficiency testing involves a group of laboratories or analysts performing the same analyses on the same samples and comparing results. The key requirements of such comparisons are that the samples are homogeneous and stable, and that the set of samples analysed is appropriate to reveal both similarities and differences in results.

The typical format of proficiency testing programs is that IFM issues a set of samples to each participant together with a set of instructions and any necessary background information. The participants then carry out the requested analyses in their normal manner and submit their results. The results are then statistically handled by IFM to generate a report. Each participant is confidentially provided with a report to allow them to compare their performance with the other participants. The performance of individual laboratories will only be known by that particular laboratory and a limited number of management personnel.

The handling of results is generally performed in a manner that compares each individual result with the consensus of the entire group. 

In the past, the statistical handling of results was done by calculating averages and standard deviations. The current preferred method of data handling is 'robust statistics', where the median result and the inter-quartile range (the range of the middle 50% of the data) are used to calculate the acceptable result range. This approach is considered a much fairer form of analysis than the classical style, because the median and quartiles are far less influenced by outlying results.
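As an illustration, the robust approach can be sketched as below. The 0.7413 normalisation factor (which scales the inter-quartile range to an estimate of standard deviation for approximately normal data) and the |z| ≤ 2 'satisfactory' limit are common proficiency-testing conventions assumed for this sketch, not details taken from this page.

```python
import statistics

def robust_z_scores(results):
    """Score each result against the consensus (median) of the group.

    Sketch only: the 0.7413 factor converts the inter-quartile range
    into a robust estimate of standard deviation, and |z| <= 2 as
    'satisfactory' is a common proficiency-testing convention.
    """
    median = statistics.median(results)
    q1, _, q3 = statistics.quantiles(results, n=4)  # Python 3.8+
    robust_sd = 0.7413 * (q3 - q1)  # normalised inter-quartile range
    return [(x - median) / robust_sd for x in results]

# The outlying result (15.0) receives a large z-score, while the
# median and quartiles themselves are barely affected by it.
scores = robust_z_scores([9.8, 10.0, 10.1, 10.2, 10.3, 15.0])
```

A classical mean and standard deviation computed from the same data would be inflated by the outlying result, widening the acceptable range for everyone; this is why the robust style is regarded as fairer.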

Benefits of Proficiency Tests to Company Group Leaders

  • By participating, the company as a whole will meet the requirements of ISO 17025 in the area of proficiency testing, from both inter-laboratory and intra-laboratory standpoints.

  • The company will have assurance of the good performance and capabilities of its analytical staff.

  • The company will have information that can assist in future planning for equipment upgrades and staff training.

  • By encouraging participation in these programs, and acknowledging staff performance and cooperation, the company has a valuable opportunity to demonstrate its commitment to laboratory staff as an integral part in the production of quality products and services.

Benefits of Proficiency Tests to Individual Laboratories

  • By participating in programs, laboratory staff can gain confidence in their abilities, and knowledge of their capabilities.

  • Over a period of time, laboratory staff will gain satisfaction in the knowledge that they have improved or maintained a level of competence comparable with their colleagues.

  • They can detect any difficulties they may have with analyses, identify training needs and have a mechanism through which these needs can be met.

  • Overall, laboratory staff will have the knowledge that they are playing a valuable part within the company group in ensuring product quality and safety, which boosts teamwork both in and beyond the laboratory environment.

 

Proficiency Testing and ISO 17025

 

Introduction

The philosophy behind the quality assurance section of ISO/IEC 17025 is, firstly, to ensure that a single analyst within a laboratory is able to consistently reproduce the same result on the same sample. Secondly, the result produced by this analyst should reflect the result that would have been obtained by any other analyst in the laboratory. Thirdly, any results from the laboratory as a whole should reflect the results that are agreed upon by many other laboratories.

 

This is why the internal and external QC and QA clauses exist within ISO/IEC 17025. The precise means of demonstrating consistency and reliability of analysis is not prescribed in ISO/IEC 17025. However, accreditation bodies have built some prescriptive clauses into their requirements to help laboratories meet the requirements of ISO/IEC 17025 in an effective manner.

 

An externally provided proficiency testing program is a useful tool for meeting the requirements of ISO/IEC 17025; however, participating in such a program will not necessarily mean that all quality assurance requirements have been met.

 

The following sections give an overview of the usefulness of externally provided proficiency testing programs with respect to meeting each of the objectives described above:

 

Objective 1: Ensuring an analyst can reproduce the same result on the same sample.

Examples of ways to meet that objective:

  • Analyst can perform duplicate tests (from sample preparation stage) on the same sample at periodic intervals.

  • The results from duplicate tests can be assessed. (A clause in the laboratory quality manual indicates the acceptance criteria of duplicate test results.)

  • If the sample is known to be stable, the analyst can re-test the sample after a period of time to ensure they get a result within the same range.
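As a concrete, hypothetical example of the acceptance criteria mentioned above, a laboratory's quality manual might set a limit on the relative percent difference (RPD) between a duplicate pair. The RPD formula is standard, but the 10% limit below is purely illustrative; each laboratory defines its own criterion per test method.

```python
def duplicates_acceptable(a, b, max_rpd=10.0):
    """Return True if a duplicate pair agrees within an RPD limit.

    Sketch only: the relative percent difference is the absolute
    difference expressed as a percentage of the pair's mean. The 10%
    default limit is illustrative; the laboratory's quality manual
    defines the actual acceptance criterion.
    """
    rpd = abs(a - b) / ((a + b) / 2.0) * 100.0
    return rpd <= max_rpd
```

For example, duplicate results of 10.0 and 10.5 (RPD about 4.9%) would pass this illustrative criterion, while 10.0 and 13.0 (RPD about 26%) would trigger an investigation.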

Can this be done with an external proficiency testing program?

  • Yes, but it is just as important to check analyst reproducibility on a day-to-day basis.

 

Requirements:

  • Define a schedule where these things will be assessed and reviewed.

  • Define acceptance criteria for the results of observations.

  • Act / investigate any non-conformances.

 

Difficulties:

  • The analyst may have an expectation of the result they should obtain, and it may be difficult to avoid performing the test in a way that 'favours' the desired result.

 

Objective 2: Ensure that analysts within a laboratory would report the same result for the same test on the same sample.

Examples of ways to meet that objective:

  • Have the group of analysts test a larger quantity of the same sample at the same time and follow the test through to its conclusion. Such samples can be external PT samples, batches of material purchased from an external supplier with a certificate of analysis providing details of homogeneity testing and results, or samples made (or saved from testing work) in-house.

Can this be done with an external proficiency testing program?

  • Yes, but the sample volume available may not be sufficient for every participant to test the same sample from their own sample preparation step (usually a weigh-out). It is possible to check consistency of diluting and identification using external PT program samples. It may be necessary to break the testing up into separate tasks (for example sample preparation and weigh-out, diluting the sample and plating, confirming the results). If this is done, weigh-out and sample preparation can be compared on samples where sufficient volume exists, and the counting/confirming steps can be performed on most if not all external proficiency testing program samples. If the sample volume is sufficient, however, there is no problem using samples from external programs for the whole task.

Requirements:

  • Define a schedule where these things will be assessed and reviewed.

  • Define acceptance criteria for the results and observations. There should be criteria for comparisons of internally generated data, and it is useful to look at the consistency of internally generated data separately from an external PT report. If internally generated data is consistent but does not agree with external PT results, it could be a sign of a systematic error or difficulty within the laboratory that is quite separate from the abilities of the staff. (This is also one of the reasons it is necessary to compare results internally.)

  • Act / investigate any non-conformances.

  • Note: It is not necessary to compare all of the analysts in a laboratory at the same time. It is important to ensure, however, that every analyst has been compared with at least one other analyst. If using external PT programs for this part of the requirement, it will be necessary to roster inter-staff comparisons to ensure everyone is included for every test over a period of time.

Difficulties:

  • If using samples made or selected in-house, there may be an expectation of the result.

  • Samples made in-house may yield inconsistent results for reasons other than operator practices. (For example, perhaps the distribution of the analyte is uneven or 'didn't take' when samples were spiked.) It may end up being necessary to perform large amounts of additional test work to establish the reasons for differences in results.

  • Staff may also feel 'under pressure' to report the same result as their co-workers, and this may lead to collusion. Staff must be encouraged to give a very objective account of their analysis, and this is more likely to succeed if the atmosphere within the laboratory is nurturing.

 

Objective 3: Ensuring a laboratory report would reflect the results obtained by most other laboratories.

 

Examples of ways to meet that objective:

  • Participate in external proficiency testing schemes.

  • Organise inter-laboratory trials with other laboratories.

Can this be done with an external proficiency testing program?

  • Yes.

Requirements:

  • Define a schedule where these things will be assessed and reviewed.

  • Define acceptance criteria for the results of observations.

  • Act / investigate any non-conformances.

Difficulties:

  • It is difficult to find program providers that cover all the tests in the range performed by the laboratory.

 

Useful Links

Tutorial: Proficiency Testing

Understanding PT Statistics 

 

IFM Quality Services Pty Ltd

IFM, Working with you, for you

PO Box 877, Ingleburn 2565, AUSTRALIA

4/58 Stennett Road Ingleburn NSW 2565 Australia

Telephone: +61 2 9618 3311

Facsimile: +61 2 9618 3355

Email: ContactIFM@ifmqs.com.au
Date of page update: June 08, 2012