Friday, March 30, 2012

Promoting Success
In my previous entry I wrote about our successful re-certification, now in its 11th consecutive year.  We received a positive comment congratulating us on our success.  It was much appreciated.  But of interest, I received some emails that were more negative, the gist of which was that there is little point to self-promotion other than satisfying some personal need. 
With respect I have to strongly disagree. 
Reporting and promoting success is a critical part of the culture of quality.  Here are some of the reasons why:
  • Meeting certification or accreditation requirements does not come easily; it takes a lot of work and a lot of commitment from all your staff.  Letting people know of your success is one way of recognizing and rewarding their efforts. 
  • Promoting success is an important building block for a group culture of quality.  Making others aware of your achievement reinforces the positives of working within your organization.
  • Quality Partners have an obligation to be seen to both “talk the talk” and also to “walk the walk”.  It is our demonstration to our partners and customers that we take Quality seriously.  If we can do it, then so can others.
  • Our customers, in this sense the hospitals and laboratories and physicians that work with us, need to be aware that laboratories do have standards and participate in inspection on a regular basis, and that when we are assessed we are found to be meeting our requirements and achieving success.  This is especially important for Quality Partners like Proficiency Testing providers and Accreditation Bodies.  All too often laboratories believe that we work to our own interests and set ourselves up as experts and authorities.  I learned this the hard way when I was publicly scolded by a laboratory technologist for this very issue.  Indeed, it was this experience that made me realize that animosity toward accreditors and proficiency testing suppliers and regulators runs very deep; we are seen with a lot of suspicion.  It is very important for laboratorians to know that just as we have expectations of them for success, others view us in the exact same way.  And more importantly, we commit to the responsibility for those expectations. 
  • The people who make decisions about whether or not they should use our program need to be aware that we go the extra mile to ensure that we can provide value and that we are open and transparent when it comes to our own Quality.  This gives them confidence that they made the right decision to work with us as a Quality Provider. 
  • The public needs to know, especially in the environment in which we live today, that we take Quality seriously.  Again this week failures in laboratory performance (radiologists in Quebec) were front and centre in the news.  The public regularly becomes aware of our failures.  It is important for them to be aware of our successes as well.
I am surprised at how many organizations don’t recognize the importance of sharing their success with others.  Most private companies get it, and it is common to see logos and certificates on their advertisements, their trucks, their letterhead and their websites.  Promoting excellence has a business value.  I recognize that in North America the majority of laboratories are public sector rather than private, and business is less of a consideration. 

But confidence building and quality assurance have cachet.  Everyone from politicians, to regulators, to administrators understands the public value of confidence and quality assurance.  That is why there are institutions like the National Quality Institute and the Malcolm Baldrige National Quality Award.

So we intend to continue along our path to inform all our customers of our success through our newsletter, our website, and send-out materials.  We will raise awareness at every public speaking opportunity.  And we will continue to do this year after year.  And for those who consider promoting success some sort of self-aggrandizement, or bragging, or manifestation of superiority, or a casting of doubt on the quality and competence of our colleagues, they need to deal with it. 

Wednesday, March 28, 2012

A Happy and Successful re-Certification

Congratulations to us.  Our proficiency testing program has been internationally assessed and again our quality management program was recertified as in compliance with ISO 9001:2008.  This makes for 11 consecutive years.  For 10 of these 11 assessments (including this year) we were found to have no non-conformances.  There were two minor opportunities for improvement, both of which we were able to address on the spot.  We are pretty pleased with our accomplishment.  And our laboratory customers agree: as mentioned previously, nearly 90 percent of survey responders said that our meeting the standard demonstrated a commitment to quality and increased our credibility.  So that makes a win-win for us and our customers. 

In conversation, some suggest that the problem with our standard is that while it makes comment on our paperwork, it doesn’t prove that we are actually competent.  They say that a 9001 assessment does not speak to competency assessment.  They say that only an accreditation body can make that judgement, to which I say “horse feathers!”.  Actually I usually say a lot more than that, but for purposes of keeping this web journal at an appropriate level of professionalism and decorum, I will leave it as “horse feathers”. 

We are as comfortable with our recognition as are our customers. 

I know there is another standard for proficiency testing programs, ISO 17043:2010 Conformity assessment -- General requirements for proficiency testing.  It is an excellent standard, one in whose development I participated.  And one day we will probably look at it.  The document shares with 9001 the intent to help assure other organizations that the assessed program is competent, with Quality being the prime objective.  That is a laudable goal.  My concern is not with either document, but with the people doing the assessment.  I don’t have a lot of confidence that many assessment bodies have either the knowledge or experience to do a proper accreditation audit.  

The body that has audited us every year for the last 11 years knows us, knows what we do, and knows our strengths and areas in which we can improve.  We have gone through this experience together and have mutually learned during the past decade.  We have worked with three different assessors from the company, and while each was a different person with individual personality and traits, over the years we have found a consistency in their approach to our assessment.  We understand that we are not their only client and that they are not our coach or mentor.  But we understand them and they seem to understand our operation. 

Certification bodies must have hundreds if not thousands of organizations wanting to be assessed to ISO 9001:2008, and so they get tons of opportunity and experience.  They know and understand the conventions of standard interpretation.  They understand and appreciate the subtleties and nuances.  They also know how to put context to the document.  They know it; they eat and breathe it.  Put simply, they know what they are doing, and they do it well.

On the other hand, there are very few proficiency testing programs in the world, and there are likely far more accreditation bodies than there are proficiency testing providers.  It is doubtful that any single accreditation body has done more than one or two (maybe three) assessments to ISO 17043.  It is doubtful that many of them have had the opportunity to learn and understand the nuances of the requirements of the standard, and for certain they are not in the position to appreciate the differences or subtleties of different organizations.  Getting assessed in that sort of situation increases the risk of individual interpretation, and that in my mind is too much risk for too little reward.  Frankly I would rather let them learn on other people and hone some understanding before letting them come to us.

It’s kind of like having a choice between a surgeon who has done a procedure a hundred times and one who has read about it.

In a few years, when the standard has been around for a while, and more people know what to look for, then we will consider the additional recognition. 

But not now.

Today we are very happy and very proud to be a part of the very small minority of programs that have gone through 11 successive external quality assessments and come out of them with honour and recognition. 

And as best as I can tell, so are our customers.

Friday, March 23, 2012

The Engaged Patient

While on my travels back to Canada I found the editorial page that I had removed from the Globe and Mail of March 5.  The editorial was entitled “The Engaged Patient”. 
Maybe the editor from the Globe is a reader of Making Medical Lab Quality Relevant because I have been on this theme for the last while.

One of the adages from our Quality Management course is that of all the Quality Partners, the single most important driver of quality is the Public, because once the public is engaged, politicians have to listen.  We may not necessarily like what happens next, but at that point it probably is fair to say that whatever happens, we brought it on ourselves.  

I think the choice of words for the title of the editorial, the engaged patient, was interesting.  By the change of one letter it becomes the enraged patient, and all too often one leads to the other rather quickly. 

If I have issue with the editorial, it is the choice of the issues that they highlight: parking spots, and hospital food, and of course wait times.  If these are what most Canadians are complaining about, then we are running a pretty good medical program.  Or they are even more in the dark than I imagine.

There are some real issues and the laboratory is often front and centre because that is where most of the diagnostic information comes from.  

For years we have bemoaned the quality and interpretability of our reports.  We have blamed the absence of simple and plain language on the Laboratory Information System (LIS) software.  If that is the case then it is time that we do something about it.  Creating reports that many physicians can’t understand is intolerable.  Providing these reports to patients is inexcusable.  

Providing tests that are fast and easy to perform, but which are insensitive and non-specific, is not acceptable.  Many of the rapid tests are about 90 percent sensitive and have a specificity at about the same level.  Sounds good, but when these tests are routinely overused on patients with a low likelihood of having disease, the value of the test drops down to about the same as a flip of a coin.
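The arithmetic behind that coin-flip claim is just Bayes’ rule.  Here is a minimal sketch; the 90/90 sensitivity and specificity figures come from the text above, while the pre-test likelihood (prevalence) values are illustrative assumptions:

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value: P(disease | positive result), via Bayes' rule."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A "90/90" rapid test applied to patients with only a 10% pre-test
# likelihood of disease: a positive result is right about half the time.
print(ppv(0.90, 0.90, 0.10))  # ~0.5, the flip of a coin

# The very same test used on a high-likelihood (50%) population does well.
print(ppv(0.90, 0.90, 0.50))  # ~0.9
```

The point of the sketch: the test hasn’t changed between the two calls; only the population it is ordered on has, which is why routine overuse on low-likelihood patients destroys the test’s value.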
When we allow laboratories to hide their incompetence by “cheating” on their quality assessment testing, or when accreditation assessments are a 2-hour visit rather than a real inspection, we invite a collapse of confidence in laboratory performance should patients and physicians ever find out.

And we have already seen what happens when breast and other cancer testing is not done with precision.
Going back to the editorial, one of the solutions that is correctly identified is COMMUNICATION.  And they are right ... almost.  I was amused when they suggested that medical organizations should be at least as good at giving information as the airline companies.  “When a plane is late, passengers are told why”.  Give me a break.  One thing you can say with almost total certainty is that airline information is more about mushroom farming than information sharing; you cover the mushrooms with dung and keep them in the dark.

But communication is a key answer.  Get the well informed patient advocate groups engaged now.  They will help us get rid of the “fast and easy and wrong” tests, and take Quality seriously.  And we will learn how to use plain language.
Our future depends upon it.

Tuesday, March 20, 2012

The Power of Quality Assessment

I had a wonderful experience today.  I am in southern Africa working on a consultant basis with a proficiency testing program.  So today we did a visit to a rural District hospital laboratory.  The term district means that while being in a rural area and small, it acts much as a regional center.  While it is a full service clinical laboratory (providing chemistry, microbiology and hematology services) it lacks certain fundamentals: internet access is irregular, there are no land-line telephones, and the electricity is relatively unstable.  This makes it tough to develop a modern communication approach to sharing information for diagnostic care.  With the exception of universally accessible personal cellular phones, communication remains paper-based and depends on local ground transportation. 

The reason that I was there was to gather some information on the laboratory staff’s knowledge and opinions of Quality systems and proficiency testing.  And what I saw assured me that I had made the right career decision when I decided to focus my attention, and whatever talent and skills I have, on implementing Quality.

The first thing that was obvious was the cleanliness and organization of the laboratory.  A near perfect example of Lean philosophy (neat, sorted, everything in its place).  Paper-based accessioning logs, again neat, organized, up-to-date, and traceable.  Well written SOPs (again manually generated).  Not a lot of books readily available, but clearly the laboratory staff here have things put together very well.

When we talked about proficiency testing and their PT provider, their response was that they see PT as their essential Quality check.  They receive samples on a regular quarterly basis with some samples on a monthly basis, and they get their results reported back to them via paper that is transported by local courier.  

They don’t have the resources to overwork samples, but they had already decided to treat their samples the same as clinical samples: a standard basic work-up, and that is what is reported.  They integrate the PT samples into their routine work on the day of receipt and test them the same day.  Whoever is “on” includes the PT samples with the routine samples and just works through the whole group together.  There is no second guessing, no tricks, no special testing.  “We get what we get and that’s what we report.”
And here is the cherry on top of the sundae.  When the PT results come back, the group of technologists get together in the little room at the back and review the results together.  They talk about what they got right and what needs fixing, and then they all sign off on the PT report.  

No tricks, no gaming, no cynicism.  A group of laboratorians working in an isolated environment doing very good laboratory work in a simple and straightforward professional way, and grateful for the support that they get from the PT program. 

This experience supports my own experience from my own program in Canada.  When we do our satisfaction surveys, large academic centers (we call them Category A laboratories) find our program interesting but say we provide little educational material that is new to them.  They get most of their new knowledge from journals and conferences and workshops and from their own research.  Fair enough.  But the smaller facilities that don’t have their own researchers on staff, or don’t have the time or resources to participate in the other elements mentioned, find our educational materials essential and describe us as their most useful source of continuing professional information.

It is enriching when you see smaller laboratories in isolated regions doing well despite the additional challenges they deal with on a daily basis.  It means that the patients in these regions have the opportunity to receive care based on quality information.  Many gaps exist in healthcare between resource-limited regions and resource-enriched ones.  

Proficiency Testing provides a way to make the gap a little smaller. 

Wednesday, March 14, 2012

Nurturing Future Quality Managers

I was reading in Canadian Government Executive an article under the banner Leadership, entitled "Nurturing Future Policymakers" by Andrew Snook.  It was referencing a program called the Policy Analyst Recruitment and Development Program (PARDP) which invites and screens applicants to identify a small number of recruits who are then trained and mentored for up to a year and a half.  Once graduated, most stay on within the ministry as part of a new cadre of Policy Analysts.   
This seems to me a brilliant idea.  There are lots of folks who want to work in the civil service, and some of them may have some understanding that the driver of action is Policy, but the only thing worse than the absence of policy is the presence of poorly thought out or unbalanced Policy.  The problem is that few in the service know what to do and how to go about it.  So create an interest, develop a pool of appropriate candidates, and then train them and recruit them.  Fill the niche and the problem is solved.

I would guess that Policy Analysts and Quality Managers have some things in common.  First off, I doubt there are many (if any) people who come out of high school, or even out of a first degree, with thoughts and plans about a career in either.  Both of these fields are acquired tastes that develop after a few years of working within a field.  These are not fields with really wide general appeal, but rather ones that draw from a group of experienced and knowledgeable workers who enjoy big picture issues but with a focus for detail. 

Traditionally most people have come to quality in the second half of their career, having spent the first 10-15 years doing routine benchwork or traditional supervision.  Most developed their Quality working base from past notions, first principles and self-directed reading.  

Today, many come to Quality much earlier in their careers, perhaps having recognized that career longevity can be enhanced by taking on new challenges.   

From my experience most healthcare organizations don’t actually train and mentor the new Quality recruits, in large part because while there is interest and awareness of the importance and potential, there is rarely anyone in the organization who is sufficiently knowledgeable.  The consequence of this lack of direction and focus is that the opportunities to build a Quality presence are made much more difficult.  All too often we see Quality initiatives flail around at the periphery of the issues until they gradually fade away.  To date, while there have been some Quality successes, they are still found in a very select group of facilities and most programs fail to progress. 

That is where programs like my Certificate course in Laboratory Quality Management have found a niche.  By covering a broad survey of Quality information in an environment that encourages conversation, application and collegiality, we can help people develop a better foundation and, importantly, a community of colleagues and mentors with whom they can collaborate.  We give laboratory Quality a greater potential for success because we develop managers with a stronger foundation. 

While at the BC Quality Forum last week, I heard about a new program developed by the BC Patient Safety and Quality Council called the Quality Academy.  It looks like another excellent education program delivered to improve Quality knowledge, and delivered right here in British Columbia.

The Quality Academy is a 5-month program, focused mainly on clinical institutional healthcare personnel, that started in 2010 (I think) and is organized in a number of modules, each addressing a different aspect of Quality.  People are proposed by their health authority but selected for recruitment by the Academy.  About 30 people are selected for each session.  Its curriculum is almost identical to that of my program.  It is largely a discussion and reading based program that works with a small group of students at a time.  It has a broad based faculty.  

It is so similar to the structure of our program (even the tuitions are the same) that it almost looks like a product of “broadly borrowing and stealing shamelessly”, which in many regards is a form of validation and a type of compliment.

So how does this relate to the PARDP program?  Here we have three independent programs, all education based, mentor and peer driven, preparing people for a new career in an important but niche area that addresses concepts of Policy-through-Quality or Quality-through-Policy.  All focus on adult education for mature learners.  

Welcome to the new world of program improvement.

Sunday, March 11, 2012

BC Quality Forum 2012

I had the opportunity to attend an interesting conference earlier this week.  It was the British Columbia Quality Forum, hosted and supported by the British Columbia Patient Safety and Quality Council.  It was an interesting meeting with lots of positives, and maybe, as we say in the Quality arena some room for improvement.

We can start with the positives.  This was primarily a local crowd from my home province.  There must have been the better part of 500 people attending.  When you consider that British Columbia has in total about 4 million people, and clinical health care has maybe 50 thousand workers, having a group of 500 come out to a 2-day conference on Quality represents a pretty substantial level of interest.

Most of the attendees were nurses or hospital or ministry related bureaucrats and administrators.  A few physicians.  I think I was the only person from the laboratory world.  So you can see there was a strongly clinical bent in the crowd. 

Watching what happens as a new movement starts to catch a lot of attention and enthusiasm can be a lot like watching a novice rider sitting atop a galloping horse; arms and legs all akimbo, holding on to the reins, trying to stay in control but very much just trying to stay atop.  There was a large scatter of projects, most of them (in my opinion) basic and trivial, but with all sorts of enthusiasm and far too much jargon talk about “PDSAing” and Kaizen and Just Culture.  The over-riding sense was lots of trial and lots more error; all the fingerprints of faddism.

One presentation stands out as the poster child of enthusiasm without substance.  A pediatric hospital saw a need to develop a new pathway to alert the team when a child was seen to be taking a “turn for the worse”.  That’s a good idea.  So after many versions and revisions (PDSAing) the document was released for use.  After two years they looked at the number of bad events there had been and saw a substantial drop the first year, but a return the next.  They would like to point to the use of their protocol as the base for success, but as they pointed out, many staff didn’t like to use the document, most of the new staff were not trained on how to use it, and none of the doctors knew anything about it.  So having learned this, what had they done about it?  Well, not much, but they had plans.  So while there was some good intention and lots of audience and self-congratulation on a task so well done, when it came down to the crunch, the real point of the exercise had been missed.  Having recognized a need, there had been a Plan to create a solution, and an implementation (Do), and an examination of how it was working (Study), but no follow-through (no Act); all this revising of the form, but no tangible steps to implement improvement. 

If that sounds negative and pessimistic and petty, that would be a poor and unfair choice of words on my part, because on balance I have more cautionary enthusiasm than discouragement.  Having been involved in a LOT of conferences at every level from institutional to international, I have seen many weak presentations.  From that perspective, this presentation is a fairly typical example of a neophyte group grappling to experiment with something new.  

I have been working in the Quality arena now for near 30 years, and it is great to see that clinically oriented healthcare is both aware and constructively starting to catch on.  If I have a concern, it would be that superficial dabbling that uses up TEEM resources without tangible results will end in disappointment, and if it becomes a flash it will quickly become a burned-out fad.

Memo to self: By its history, development and quantitative nature, the medical laboratory is much further along the path to Quality implementation than the clinical service, but typical of the introverted nature of laboratorians, there was virtually no laboratory presence.  It is consistent with the absence of laboratory presentations at the annual conference on resident education.  Not participating in these Education and Quality forums is a poor decision.  If you don’t get engaged in the community, you can’t be disappointed when it doesn’t appreciate your accomplishments.  

Next year, if the BC Patient Safety and Quality Council decides to put on a forum again, I am going to ensure there are at least 10 laboratory presentations.  My group will do four. 

Now all I have to do is convince some of my colleagues to submit the other six.

Tuesday, March 6, 2012

Quality, Cost and Pay-for-Performance

I was reading an article in the Ottawa Citizen newspaper today about the challenges coming to Canada’s largest province with respect to health care funding.  Money is running out, and new funding formulae are being put into place with a very prominent role for “pay-for-performance”.  For those who don’t know the term, it means that those organizations that demonstrate themselves to be more productive and more effective get more funding and support than those that continue to be less productive and less effective.

For everyone and anyone interested in Quality and Cost, both inside and outside the laboratory, your moment has arrived.  Those who let this slide by are missing out on the opportunity to shine the light on the Costs of Poor Quality and to point out how many unbudgeted dollars can be recovered.

A few thoughts. 

  • We know there are many errors that occur on a regular basis, most of them seemingly trivial and without consequence, but some that lead to inconvenience and others that go very badly wrong.  The problem is that we don’t know which is which in the beginning, so all of them have to be addressed.  That results in significant TEEM costs.  Data entry errors consume about 30 minutes to fix.  Analyzer errors consume hundreds of hours.  On average errors cost 130 minutes each, and can impact 10-30% of staff.  In a large laboratory with 200 employees that can represent 20 salaries every day, all of which is non-budgeted time loss.  Even a modest reduction in error can reduce consumption of wasted dollars.  Adding a quality team that can focus on error reduction will pay for itself through savings accrued.

  • We know that even appropriate clinical requests lose their value when samples are collected at the wrong time and in the wrong way.  Many result in errant, or uninterpretable, or worse, wrongly interpreted results.  The more tests that get ordered, the worse it gets, because ordering more just makes more people busier, and busier people make more shortcuts and more mistakes.  Quality teams can fix this too.  A few years ago when the topic of clinical technologist positions arose, it was seen as a luxury, unlikely to have the same impact as clinical pharmacists who roamed the wards reviewing pharmacy orders before they got filled.  But if a clinical technologist program prevented only 4 or 5 errors per day, it would pay for itself through reduced time requirements.  Changing a technologist’s work function from front-line bench to front-line ward would add variety to their position, and may even help clinical staff and patients better understand the subtleties of test results.  I know the wards will be filled with moans and groans and grief.  “That’s a REALLY bad idea because that’s not their job or their knowledge and they are intruding on our domain.”  That’s what they said about clinical pharmacists too, and the clinical pharmacists proved to be a pretty valuable asset.

  • And here is a third idea.  I raise this with caution, but I am pretty confident that I am right.  It is time to review and rethink regulations that add to cost but address only theoretical risk.  We have introduced so many requirements that are based on ZERO tolerance for risk, when most in our society acknowledge that some level of risk is OK.  I will give two examples that we should emulate.  In Canada, we allow samples such as stool for occult blood to be transported without being considered dangerous goods, even though they might contain Salmonella or C. difficile.  Similarly, we are comfortable with the notion that if we don’t know that a blood sample contains HIV or Hepatitis B, then it probably does not.  These are good examples of recognizing and tolerating a level of acceptable risk.  So if that is OK, and I support Transport Canada’s level of administrative bravery on that score, why do we then put huge financial burdens on other situations where true risk is even lower?  How much could our society save if we predicated decisions on rational risk as opposed to theoretical risk?  Sending isolates from one laboratory to another for better identification, or sending blood samples for Syphilis serology, represents the same trivial level of risk but results in Dangerous Goods charges.  So does the transport of Quality Assurance materials.  I suspect with a moment’s thought you can probably come up with a whole slew of other regulations that make little sense but cost many dollars. 
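The back-of-envelope arithmetic in the first point above can be made explicit.  This is only a sketch using the illustrative figures from the text (130 minutes per error, 10-30% of a 200-person staff affected daily) plus an assumed 8-hour working day; the text’s round figure of 20 salaries is of the same order:

```python
# Cost-of-poor-quality sketch: daily staff time consumed by error handling.
# All figures are illustrative, taken from the text, not measured data.
STAFF = 200                # employees in the large-laboratory example
IMPACTED_FRACTION = 0.30   # upper end of the 10-30% of staff affected daily
MINUTES_PER_ERROR = 130    # average time consumed per error, as cited
SHIFT_MINUTES = 8 * 60     # assumed length of one working day

lost_minutes = STAFF * IMPACTED_FRACTION * MINUTES_PER_ERROR
lost_fte_days = lost_minutes / SHIFT_MINUTES
print(round(lost_fte_days, 2))  # ~16 full-time-equivalent days lost, every day
```

Even at the lower 10% figure the loss is over five salaries a day, which is the point: these are unbudgeted dollars that a quality team focused on error reduction could recover.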

So here is my bottom line.  Health care is a massive bundle of costs, with far too much waste.  It is no longer appropriate or acceptable to make decisions on “well, what would you want for your grandmother” or “to hell with cost … my job is to cover my backside.”  The opportunities for reducing costs by actively addressing the costs of poor quality and poor decision making are too great to be ignored.

Bob Dylan said “the times they are a’changing”.  Here’s hoping he was right on for right now.