Friday, April 18, 2014
Fishing and Proficiency Testing
In our UBC Certificate Course in Laboratory Quality Management we introduced a concept we call Quality Partners. These are agencies and organizations whose primary purpose is to enhance the quality of medical laboratories. Without these partners, the likelihood of a medical laboratory developing any semblance of quality enhancement and error reduction is, for all intents and purposes, negligible. The primary Quality Partners are Accreditation bodies, Proficiency Testing providers, Standards Development organizations, Education providers, Professional Organizations, and Suppliers. While I am biased, of this group the only partner that has ever demonstrated the ability to actually reduce errors is the Proficiency Testing provider group.
When PT exists with good and competent programs, laboratories receive samples that closely resemble actual samples, with the singular but essential difference being that the sample has been pre-tested and stabilized so that the correct result is known by the provider. The challenge to the laboratory is to test the sample and come to the same value. If the laboratory can do that, it is told it is proficient; if it cannot, it is told it has opportunities for improvement: to learn how and why the mistakes were made, and to determine how to avoid such errors going forward.
PT providers are a committed group of people, dedicated to the premise that helping laboratories overcome errors directly contributes to patient care and patient safety.
While my own personal focus is on water testing and patient-care microbiology sample testing, most developed countries have PT providers not only for medical laboratory and water testing but also for food, engineering, and many other types of laboratory testing.
Unfortunately, developing countries see the value of developed medical laboratory testing, and see the value of quality partners, but they do not have local access to PT. And that is a problem.
For many, the solution is to purchase PT samples from a provider in a larger country, which is always more than happy to supply them. That the samples were never designed to address the issues of developing countries, and that getting them to laboratories at considerable distance means compromising their quality by freeze-drying, is of little concern. The informative sheets are written from the perspective of the large parent country, without any perspective on the developing region.
So the product they receive is costly, not designed for their purpose, not optimised for their purpose, and not assessed or graded for their purpose. That does not seem to be much of a deal, and from what I have seen, the success rates on these samples are rarely even close to acceptable.
We have taken a different approach.
Rather than just shipping samples, we bring one (or preferably two) able persons to our centre and give them intensive training in producing samples and setting up a basic program: how to select, produce, and transport samples relevant to their setting, and how to grade them and produce a critique. As mentioned, it is an intensive program. Some get the information they want in one session; others come back for a second session.
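As an illustration only (this is a hypothetical sketch, not CMPT's actual grading rubric, and the laboratory names and organisms are invented), qualitative microbiology PT grading often comes down to comparing each laboratory's reported result against the provider's pre-established reference result and tallying acceptability:

```python
# Illustrative sketch of qualitative PT grading (hypothetical scheme,
# not CMPT's actual rubric): compare reported results to the provider's
# pre-tested reference result and mark each laboratory's submission.

reference = "Salmonella enterica"  # result established by pre-testing the sample

# Hypothetical submissions from participating laboratories.
reports = {
    "Lab A": "Salmonella enterica",
    "Lab B": "Escherichia coli",
    "Lab C": "Salmonella enterica",
}

def grade(reported: str, expected: str) -> str:
    """Acceptable if the reported organism matches the reference result."""
    return "acceptable" if reported == expected else "unacceptable"

for lab, reported in reports.items():
    print(f"{lab}: {grade(reported, reference)}")

# A critique would then explain why unacceptable results occurred
# and how similar errors can be avoided going forward.
```

The value of the critique lies less in the pass/fail tally than in the explanation that accompanies it.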
Over the duration of our program we have usually had one country participate each year. This year we will have had representatives from 3 countries. And we are getting contacts and expressions of interest for next year. So by the measure of awareness, we seem to be having success.
This summer we will be doing a survey to see if the countries that have visited us have implemented programs and if those programs are still in operation. (Satisfaction and Loyalty).
Most of us know the proverb as "give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime." The original version was "…give a man a fish he is hungry again in an hour; if you teach him to catch a fish you do him a good turn." It was written by a British author, Anne Isabella Ritchie, and published within her story Mrs. Dymond, printed in the American magazine Littell's Living Age in September 1885.
Personally I prefer the original version.
PS: If you want to know more contact us through www.cmpt.ca
Saturday, April 5, 2014
Learning about Risk and Prevention through Continual Improvement
Yesterday we had our 12th external audit by our ISO certification body in 12 years. You would think that by this time we would have the process down pat, and that it should be a walk in the park. Well, that is sort of true. In the early days we would approach audit day with a certain degree of anxiety and distress. These days those negative feelings are gone, and in fact we look forward to the exercise because we still find opportunities for learning and opportunities for improvement.
Some would say that now that ISO 17043 (Conformity assessment — General requirements for proficiency testing) exists there is no longer a place for assessment of proficiency testing programs to ISO 9001. I can see the argument, since ’43 is more specific, but to my mind there is no document more specific and exacting about quality management than 9001. So while we will likely seek an external auditor that we have confidence can do a competent assessment to ’43, we will continue to stick with 9001. Indeed, to my way of thinking, I suspect that when the smoke clears we will likely find a way to do both.
One thing that I appreciate about our current ISO 9001 assessment body (SAI Global) is that they provide us with an experienced auditor with a lot of knowledge and skill in the laboratory area; several of his clients are laboratories. He is knowledgeable about calibration, equipment, and quality control, and our assessment covers these areas thoroughly. In addition to the regular close look at our quality system, management review, customer satisfaction, and more, we also get a solid review of critical laboratory expectations. So in a sense we feel we are getting the best of both worlds.
So how did we do? We were found to have no non-conformities. We continue to meet all the requirements. But we are not perfect. Although we do two major internal audits each year (one in February-March and one in July-August) and had the documentation available, when I wrote up my Management Review report I forgot to add an "Internal Audit" section. We probably could have gotten away with that, except that we publish the MR review in our Annual Report (available on-line at www.CMPT.ca), so the absence of an IA section was pretty obvious. This was written up as an "OFI - Opportunity For Improvement" rather than as a non-conformity. I could go back and edit the MR review, but we agreed that the more appropriate approach would be to include the section in the next MR in our next Annual Report.
The other thing that came up is a common problem, but I think we have found a good solution. Like most organizations we are weak when it comes to having a documented list of Preventive Actions in our report. It looks like we continue to be only re-active to problems rather than pro-active through prevention. But in fact that is not true. Our problem is one of recognition and value.
We make changes and improvements all the time. We changed our glass slide labels to a better material so that they would not fall off or get covered with stain. We changed our packaging to reduce the risks of leakage to a level beyond requirements. We implemented our earthquake preparedness safety kit, and revised the labelling on some of our reagent bottles. There have probably been 15-20 of these types of changes over the last year or two, and while we have thought of them as product improvements and safety requirements (which they are), they are also true Preventive Actions, because each of them reduces a risk of inconvenience to ourselves or our customers.
So now as we continue along our path of innovation and continual improvement we can also look at ourselves from a Risk and Preventive Action perspective. And if that was all we learned from this year’s audit, it was well worth the cost.
As a PS note, some explanation may be necessary. If we are so satisfied with our current assessment process, why would we even be thinking about including additional assessments? The answer is that for some of our customers (clients), seeing the words "Accredited to ISO 17043" is becoming more important, and so I suspect our move to accommodate their concerns will come to fruition fairly quickly.
Thursday, April 3, 2014
Over the years I have organized 22 Quality Seminars for what used to be called the Conjoint Meeting, an annual event in Canada that has brought all the interests of microbiology, infectious diseases, and public health together for the last 80 years or so. Today it is called the AMMI Canada CACMID conference, named for the two principal host organizations.
The Quality Seminars have changed their name and purpose since their inception in 1992. At first they were an opportunity for laboratory accreditation bodies and proficiency testing bodies to get together and discuss common interests. When the accreditation bodies dropped out, the proficiency testing groups continued on, and we could chat about interesting PT issues. Over time the focus moved again, first in the direction of standards development, and then very much into areas of individual interest, such as international activities, education, and finally a whole slew of Quality-oriented topics, affecting mainly microbiology laboratories but eventually including the broader topics of error, culture, and continual improvement.
The Quality Seminar has always been seen as separate from the main meeting. At the beginning it was an event that people could attend on the day after the conference was over, especially if they had a late flight the next day. Then it became one of the day-before events, so that people would have something to do when they got to town a day early. Both of these times worked out well because we were not competing with the chaos of the full meeting, and we had lots of time rather than being limited to a slot of 45 to 90 minutes. While others were being shoehorned in, we always had at least 4 hours to fill, enough for 4-5 speakers, discussion, and coffee breaks.
The audience was always a select special group. Even in the arena of laboratory and clinical medicine, the audience for Quality issues has never been huge. But it was a faithful group, and pretty much every year we could fill a small or intermediate-sized room for the whole session.
Of interest, as the conference expanded, other groups began to see the day-before block as a desirable time slot, and we found ourselves competing with other groups during our afternoon block, probably to the disadvantage of both. The group holding the competing meeting locked up all access to the physicians-in-training (aka Residents), a group that before this change would have benefited from attending the Quality Seminar and being exposed to a topic for which their routine training was (and still is) somewhere between inadequate and absent. And we would have enjoyed having the Residents with us, because they would have increased the group size and expanded the discussion. But it just was not to be.
So now, after 22 years, I think it is time for me to call it quits. I have pretty much run the gamut of topics and I am not sure I see the value to me in continuing on. That does not mean that the Quality Seminar will end; indeed I hope that is not the case. It just means that someone else will need to take on the responsibility to organize it, lead it, keep it afloat, and move it towards its next iteration.
For those interested, the presentations from the meeting will be on-line at www.POLQM.ca under the title AMMI-Canada CACMID 2014.
Wednesday, April 2, 2014
Give this a try. Created for our Certificate Course in Laboratory Quality Management
If anyone gives it a try I will post the key in a few days
Saturday, March 22, 2014
The problems with research
It’s always interesting when things come together.
Over the last few days I was at an international meeting where the potential value of a Best Practice Guideline document for researchers was raised, prompted by a general experience that too many research dollars are wasted on non-reproducible and irrelevant studies. An interesting proposal, I thought, but perhaps a tad overstated.
Then I get to the airport, pick up the Economist (March 15-21, 2014), turn to the Science and Technology section, and there is an article called Metaphysicians which mentions John Ioannidis (Why Most Published Research Findings are False). I have discussed this author and this paper before [see: http://www.medicallaboratoryquality.com/2012/12/quality-and-medical-research.html ] because of his interest in doing “research on research”. According to the article, a scourge of modern science is the “waste of effort”. In 2010 “$200 BILLION (85% of total expenditure on medical research) was squandered on studies that were flawed in their design, redundant, never published, or poorly reported”. Assuming those numbers are true, that would certainly confirm there is a clear need for help.
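As a back-of-the-envelope check on those figures (my own arithmetic, not from the article), if $200 billion represents 85% of the total, the implied overall 2010 spend on medical research is roughly $235 billion:

```python
# Back-of-the-envelope check on the Economist figures (illustrative only).
wasted = 200e9          # dollars reported as squandered in 2010
wasted_fraction = 0.85  # reported share of total medical research spending

total = wasted / wasted_fraction
useful = total - wasted

print(f"Implied total spend:  ${total / 1e9:.0f} billion")   # ~$235 billion
print(f"Implied useful spend: ${useful / 1e9:.0f} billion")  # ~$35 billion
```

On those numbers, only about $35 billion of effort would have been well spent, which puts the scale of the problem in stark relief.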
According to Ioannidis, too many researchers staggeringly over-interpret statistical significance in studies that are far too small. Further, they lack insight into proper study design and manifest “publication bias”, where positive data gets written up for presentation and negative data gets ignored, or worse.
Over the years we have observed that graduate students working in research laboratories seemingly lack knowledge of, interest in, and respect for running essential common equipment such as autoclaves. Commonly the problem is that they are required to use the equipment but never received appropriate training; in essence they were just pushing buttons. (I suspect that in the minds of their Principal Investigators, autoclaves are mundane compared to DNA analysis.)
Furthermore common procedures of quality control and quality assurance are often incomplete or inadequate and generally not understood.
What they do not seem to understand is that in the absence of basic Quality principles NO research can or should be trusted.
Apparently Ioannidis is doing something about it. He has opened a centre for meta-research at Stanford, the Meta-Research Innovation Center, with the even more appealing acronym METRICS. That will help to define the problems and perhaps develop some answers.
Here are some questions that I would like to see addressed:
A. Would increased training in laboratory Quality result in reduced non-reproducible research and increased value for dollar spent?
B. Could accreditation of research laboratories result in reduced non-reproducible research and increased value for dollar spent?
C. Would introduction of proficiency testing into research laboratories result in reduced non-reproducible research and increased value for dollar spent?
I was intrigued by the notion of a Best Practices in Research Guideline; however, I also understand that in the current environment such a document would be pursued only by those few laboratories that recognize the concepts of standardization and continual improvement. The vast majority would, I suspect, never become aware of its existence, or never purchase it, or never read it, or read it but never consider it relevant or appropriate to their laboratory.
I know this sounds a tad cynical, but for how long have we taken that approach in the health sciences? “It is not our problem, we are too smart; if it is a guideline we can ignore it, and if it is a regulation we can obscure it.”
This may well be the time when far more aggressive research oversight becomes a reality. If a highly qualified author writes something in Accreditation and Quality Assurance (an excellent journal) its impact is strictly limited. But if articles end up in the Economist, that is a different story. Business folks read the Economist, as do lawyers, and politicians, and University presidents, and many of the general public.
Sooner or later, the right (wrong?) person is going to start putting 2 and 2 together; wasting $200,000,000,000 is an insult to the public purse. And then the regulations start happening, and some people lose their jobs, and some people end up in jail.
Even if we spend $2B on setting up stringent oversight, we are way ahead of the game. And the impact on jobs would be negligible, because for every research laboratory that is shut down, some of its people would likely move into consultancy or oversight.
Seems like a win.