Wednesday, December 28, 2011

A different view of RISK

Previously I mentioned Management Mythbuster by David Axson as a great read.  I still stand by that.  But it is important to say that David Axson is not a laboratorian, and he did not write the book as a guide to medical laboratory quality.  So that creates a challenge: how to find useful information that has applicability across the broad spectrum.

Take the topic of risk. 
Medical laboratories carry all sorts of risk.  Although Axson does not classify risks in this way, I can, by extension from the chapter, put together the following:

  • Patient risk: for example when a test is easy to perform and is repeatable (verifiable) but the test results have no discrimination value to document or diagnose pathology or wellness (not validated). This would also apply to the opposite (validated but unverifiable, or insensitive or non-specific).
  • Innovation risk: if a new piece of equipment obtained to perform the test proves to be unstable or unreliable or inefficient.  (Another version of innovation risk is when you sink a lot of money into a piece of equipment or assay only to find that the next new-and-improved version comes out two years later)
  • Organizational risk:  when inappropriate policies get introduced, or the very opposite.
  • Community risk: where a procedure or a reagent or equipment can constitute a safety risk to staff or the public. 
  • Reputational risk:  Even in the public sector a black eye is still a black eye.  Think about those jurisdictions that have been tainted by tumor marker, HIV and Hepatitis B, or infection control scandals.  All too often the reputation damage goes on for years, making it tough to win back public confidence or to recruit the staff that you want.  And the staff whose names get tied to a mess can pretty much shrink their expectations for an upward career.
  • Financial risk: associated with investing in equipment to perform a specific test only to find out that either the test has no interest for users, or there is lots of interest but the insurance bodies decide that the test will not be reimbursed. 
  • Fiduciary risk: when things go sufficiently wrong to the point where all sorts of bad legal things happen. 
  • And risk from the unknowns:  (Think Donald Rumsfeld – there are the known risks, there are the risks you know about but lack information on, and then there are the unknown unknowns, where you don’t know what you don’t know.)

Taken as a whole, you could argue that taking on new tests and new equipment and new personnel can generate enormous risk, but in my experience many laboratories create all sorts of problems for themselves by being attracted to new tests or procedures based more on the number of bells and whistles than on risk assessment.  Business cases rarely include any acknowledgement or awareness of potential downside risk.

With respect to the unknown and unplanned, Axson writes that bad things happen all the time and “success …is more about how [organizations] deal with the unplanned adverse events than how you execute when everything is going smoothly”. 

Axson makes a number of points that I think apply not only to the business world but to medical laboratories as well. 

  1. If you are taking on a new project, try to plan not only for traditional risks, but think about non-traditional considerations as well (reputation, environmental impact).
  2. Risk management is more about preparation than it is about reaction.  If you aren't looking for bad outcomes, then you probably won’t find them until it is way too late.  You may not know why problems are developing, but if you know the problems are occurring (and increasing?) you can jump into action.
  3. When bad things happen, success is largely about quickly identifying and correcting bad decisions. Have some sort of contingency plan structure available.  Have the courage to admit mistakes and quickly take corrective action.

All in all, a fresh new look at an old problem.

Originally written December 26, 2011.  Updated July 10, 2012

Monday, December 26, 2011

The problem with Management (1)

I got a great gift the other day, a new book (published 2010) entitled Management Mythbuster by David A. J. Axson (John Wiley and Sons, publisher).  It is a great read.   Axson is a banker, financial-management consultant, and author of considerable experience who went through the financial meltdown that started in 2008.  I don't know what he was like before, but at the time of his writing he had a pretty cynical edge to this book.  From what I know and understand, that cynicism is well deserved.  

The last several years have not been particularly kind to traditional management.  At the time of the writing of “Mythbuster” the media had already said an awful lot about the problems that management had caused in the financial world.  And the world has continued along that line.  What with the ongoing pressures on the sovereign debt of Greece, revelations of pretty gross banker salaries and benefits, and the whole Occupy Wall Street movement's focus on CEOs and really obscene payments for failure, a lot of the business-as-usual of management has come under considerable scrutiny.  I suspect that trend is likely to continue for some time yet.  (Noble’s law: Nothing moves Quality more effectively than an angry public.)

What Axson writes is not a defence of management practices; rather, he takes a pretty harsh look at really shoddy performance in missions and visions, envisioning, projections, and the use of consultants.  Quality, Risk, and Budget management are not spared. 
First of all, let me say that a big part of my own belief system is pretty much in full support.  When I see what has been wrought in Canadian healthcare, with excessive payouts to a slew of pretty unimpressive consultants and the massive expansion of management positions with little to show on the improvement side, I have to say I was pretty “right-on” with Axson.  But before we go all the way and bring in the marauding hordes to storm the hospital executive suites, I think it is more important to step back and take another look. 
Before we trash all these tools, I think we need to acknowledge that while there are lots of examples of excess, we should not throw the baby out with the bathwater. 
Take Six Sigma, for example.  There are many examples of laboratories that have bought into Six Sigma in a very big way.  I know one laboratory network that has taken on a Six Sigma staff of more than 10 folks.  All sorts of projects get done: some simultaneously, some back to back, some leading to implementation plans, but most not.  The problem is that all that activity costs a fortune, with salaries and benefits alone well in excess of a million dollars a year.  Throw in meetings, and travel, and implementation trials, and that value doubles or trebles.  With all due respect, the chances that any of this will even come close to breaking even are near zero.  That sort of use of Quality tools has more in common with faddism and cult belief and the use of “OPM” (aka “other people’s money”).    Axson says in essence that this is a tragic waste.
I would not go that far. 
I think the folks at Motorola were on to a good thing.  Six Sigma was a simple extension of ISO9000, which in turn was a basic codification of the knowledge and experience gathered first and foremost by W. Edwards Deming and Walter Shewhart.  Define, Measure, Analyze, Improve, and Control (aka DMAIC) is just another, more complex way of saying Plan-Do-Study-Act.  That’s a good thing.  The problem with Six Sigma as it is practiced in the places that I have seen is that it is always project oriented with little continuity.  That is a bad thing.
If Axson presents an overall philosophy, it is to keep things simple and flexible and rational.  And that is a concept with which I am completely on board. 
If we start off at a point of error awareness and the need to remediate and correct, and we focus on continuous improvement, our focus is more refined and our opportunities for success are enhanced.  Simple is better.
If you have the time, you will find Management Mythbuster on both Amazon and Indigo on-line.

Friday, December 16, 2011

TDG - a call to action

Sometimes they just get it wrong.

I have been reflecting on a truth that I have long held about laboratory Quality (and Quality in general): that the greatest mover of quality is an angry public and the actions of the public action arm (the media, legislators, and the litigators). We have good examples of this.  The Clinical Laboratory Improvement Act and Amendments would never have occurred if the public had not been outraged by poor Pap test readings.  The Cameron Commission would not have demanded that Newfoundland laboratories start along a quality improvement path if the public had not been aghast at the incompetence of laboratory diagnosis.  There are all sorts of examples that make the point.

But recently I have been aggravated by the other side of public engagement in laboratory decision making: when it drives excessive utilization and huge TEEM (time, effort, energy and money) expenditures with absolutely no tangible benefit.     

Transportation concerns.  For programs like proficiency testing programs, or clinical laboratories sending samples to reference laboratories, or researchers sending samples, in particular microbial samples, a major concern is the cost and procedures required to ship a sample container.  This is one of the best examples of excessive governmental response to public unknowledged anxiety.  (Unknowledged anxiety is the easy but inappropriate over-response of the public or its representatives to perceptions of risk or danger predicated solely on myth or misunderstanding.)    

Any laboratory that is involved in long distance transport of samples by road, rail, ship or air learns about “Transport of Dangerous Goods” regulations, affectionately known as TDG.  In order to send a simple sample one needs to have specially trained and certified personnel, access to special packaging materials, special labels, special way-bills and then pay really outrageous transport surcharges on top of already expensive regular shipping fees.  It is the perfect example of lost time, effort, energy and money. 

I would like to be able to say that all this TEEM is well spent, but the reality is that it is not.  It is near impossible to find even a single accident that all this regulation has prevented. 
I have been railing against the TDG impediment for a long time.  I have been involved in transport issues since the late 1970’s and have been engaged in trying to modify regulations since the mid 1990’s.  In Canada, I have had some successes, but we still are plagued by huge costs and lost time.

Over the years in my program we have shipped approximately 40,000 boxes containing proficiency testing materials without a single reported leak, damage, or exposure event.  For programs like UK NEQAS or CAP the numbers would likely be 100 times that much, but again one can’t find any published examples of exposure events.  But in the meantime this is costing our laboratories collectively millions of lost dollars.  Even if we push the cost on to the receiving laboratory, it is still lost dollars. 

We have performed and reported upon and published transport studies looking at the risks associated with leaks and damage.  We have studied other contributions to the literature.  And we have come to the clear conclusion that the levels of risk are completely theoretical. 

I calculated the risk of an exposure at about 1:250 billion over a 10-year period; adding in estimated under-reporting raises the risk to 1:2.5 billion.  These are both below the level at which calculating a risk sigma value is meaningful.  In addition, for transport that may cross an international border, the rules can cause delays of near a week, which can cause all sorts of sample damage. 
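Out of curiosity, odds like those can be put in sigma terms. A minimal sketch (my illustration, not part of the original study, assuming the conventional 1.5-sigma shift used in Six Sigma practice):

```python
from statistics import NormalDist

def sigma_metric(p_defect, shift=1.5):
    """Short-term sigma metric for a given defect probability,
    using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - p_defect) + shift

# Odds quoted above: 1 in 250 billion as reported,
# 1 in 2.5 billion adjusted for under-reporting.
for label, p in [("reported", 1 / 250e9), ("adjusted", 1 / 2.5e9)]:
    print(f"{label}: {sigma_metric(p):.1f} sigma")
```

Both figures land well above seven sigma, far off the usual six-sigma scale, which is consistent with calling the risk essentially theoretical.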

You might argue that what we are doing protects us from illicit exposure to dangerous pathogens, but of course that would be wrong as well because the true illicit plan would be to purposefully NOT add the markings.   Truth be told it would be impossible to know how many samples are transported without any markings.

A few years ago I had a discussion with a senior person in transport administration when the comment was made that if one of my packages containing live organisms was to fall out of the sky from a damaged airplane, then I could be held responsible for an outbreak.  When I commented that the jettisoned jet fuel or lost metal parts from said damaged plane would be a much bigger concern, there was not even a crack of a smile.  And my comments about the risks associated with falling body parts only made things worse. 
So what does all this have to do with Quality?  Well, clearly the first job of the Quality Manager is to take their lumps and make sure that TDG training and certification is up to date.  Make sure that the packaging is correct, as well as the labelling and the resultant paperwork.  While you probably would get away without it, the TEEM costs of getting caught would be infinitely worse.

But knowing that TDG is driven by myth and misunderstanding creates an obligation; knowing the incumbent costs in time and money, and the potential damage to samples, is a call to Quality arms.  Engage your colleagues and start a letter campaign, or better yet a telephone campaign.   Apply pressure, respectfully but directly and with reasonable force.  Seek the exemptions possible as described within the regulation.

Saturday, December 10, 2011

Giants of Innovation

I have been ruminating upon this for several weeks, and so I apologize for being late out of the gate. 

By now everyone is aware that Steve Jobs passed away on October 5, 2011, as did Robert Galvin on October 11, six days later.  I have, like most, known of both men for years, but never had the opportunity to meet either.  I am saddened by the loss of what these two giants represent: the faces of American innovation.

Robert Galvin was the President and CEO of Motorola, the company that created the “wireless world” first with the walkie talkie* and then the cellular phone, and then the cellular phone network.  Steve Jobs had the vision to see the power of these incredible tools and developed his communication powerhouse accordingly.  Jobs’ early computer, the Apple II, was built around the MOS Technology 6502 microprocessor (designed by a team of former Motorola engineers), and the Macintosh was built around the Motorola 68000.  Had Motorola not created the foundations, there would be no iPhone, or iPad, or iPod. 

I imagine that these two men knew each other, probably very well.  This adds a certain poignancy to the two deaths.  Indeed one can forgive me for “seeing” their companion deaths as somewhat akin to the paired deaths of close life partners.  That being said, I suspect that it would not have been a particularly “happy union”; the two apparently did not care much for each other’s company.  They were very different people, with Galvin being a warm family man with many hobbies.  Today the two companies spend a lot of time in law courts around the world.

So why am I writing about this?  First off, the obvious: Robert Galvin and his link to Quality through Motorola’s development of Six Sigma.  Galvin did not invent Six Sigma; it was the creation of Bill Smith and Mikel Harry, a modern inspiration derived from Shewhart and Deming and Ohno.  But Galvin saw the power of the new language and measurement tool based on defects per million.  Galvin had the vision to promote this revitalization of the quality movement.  The characterization of a sigma metric of 5.5 versus 4.1 has given quality-oriented minds a whole new appreciation of the power of simple expression of complex values.
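As a rough sketch of what that compact expression carries (my illustration, again assuming the conventional 1.5-sigma shift), the sigma levels mentioned above translate into defects per million like this:

```python
from statistics import NormalDist

def defects_per_million(sigma, shift=1.5):
    """Expected defects per million opportunities at a given
    sigma level, under the conventional 1.5-sigma shift."""
    return (1 - NormalDist().cdf(sigma - shift)) * 1_000_000

for sigma in (4.1, 5.5, 6.0):
    print(f"{sigma} sigma: {defects_per_million(sigma):,.1f} DPM")
```

Roughly 4,700 defects per million at 4.1 sigma versus about 32 at 5.5, and the familiar 3.4 at six sigma: two digits that summarize a lot.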

But equally important is the lesson that innovation has many approaches and many faces, and all of them move us forward.  One does not have to invent a new industry; one only needs to see it and improve upon it.  And even if you see the iPhone as perfection, I think what is more important is how it has inspired the next wave of even greater perfections. 

I suspect that neither Galvin nor Jobs thought much about the medical laboratory (well probably Jobs did), but these two innovators are directly responsible for most of our evolution over the last 35 years.  (I set up my first crude laboratory information system on the Apple II).   Near every aspect of what we do, in near every laboratory in the world, benefits from these giants of innovation.   All I can say is “thank you”.

*      A sort of personal interest note: Robert Galvin hired Daniel Noble as the engineer to develop the walkie talkie.  I am unaware of any connections between my family and his.

Friday, December 9, 2011

Policy and the Law - word games?

The word policy is one that requires definition.  According to ISO9000:2005 (fundamentals and vocabulary) the term policy means the “overall intentions and direction of an organization as formally expressed by top management”.  Well, if that's how it appears in ISO9000, that must be right. 

But I must admit that I take the definition a little further.  I think of policy as a statement of principle established by top management that serves as a foundation for the action (processes and procedures) implemented to meet requirements.  Policy exists to answer the question “why do we do that?” 

An example would be to address the question “why do we do proficiency testing?” with the statement “because we have a policy that says that we monitor our performance in a variety of ways, including PT, to ensure that we are meeting Quality expectations” rather than “because we do” or “because I told you so”. 
(I once asked someone in my training days why we always treated all children in a specific way and received the answer “we do it that way because that is the way we do it”.)

As opposed to the traditional view of policy, as the apex of a triangle above process and procedure, I tend to think in reverse with policy being the roots that provide nurture and foundation throughout the Quality Tree.

Policy is a form of top management instruction or guidance or rule.  Taken one step further, policy is a form of law. 

I mention that because recently I have been reading an English translation of an old book by Frédéric Bastiat entitled “The Law” (Ludwig von Mises Institute, Auburn, Alabama).  

Bastiat was an early French economist writing around the time of the French Revolution of 1848.  (Note: many are aware of the first French Revolution of 1789-1799, but know less about the revolutions of 1830 and 1848!)  In many respects the conditions in France of 1830 and 1848 had a lot of similarity to recent events that have led to the “Arab Spring” and the “Occupy Movement”, so reading Bastiat today is not solely of arcane interest.  Lessons may still apply.

In any event, I did a little experiment. 
I took an available electronic copy of an English translation of “The Law” (The Foundation for Economic Education, Inc., Irvington-on-Hudson, New York) and did a global search-and-replace changing the word “law” to “policies”.  I then scanned the product and adjusted for grammar.
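For anyone who wants to repeat the experiment, the substitution step is easy to sketch (hypothetical code; a crude stand-in for the scan-and-adjust-for-grammar step, which still has to be done by hand):

```python
import re

def law_to_policy(text):
    """Whole-word swap of 'law(s)' for 'policy/policies',
    preserving capitalization; words like 'lawful' are left alone."""
    for pattern, replacement in [
        (r"\bLaws\b", "Policies"), (r"\blaws\b", "policies"),
        (r"\bLaw\b", "Policy"), (r"\blaw\b", "policy"),
    ]:
        text = re.sub(pattern, replacement, text)
    return text

print(law_to_policy("The law is the organization of the natural right of lawful defense."))
# prints: The policy is the organization of the natural right of lawful defense.
```

The word-boundary anchors (`\b`) are what keep “lawful” from becoming “policyful”.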
The results, by and large, were what you might expect when you start to fiddle with words: a lot of nonsense sentences.  For example:

  • policy is the organization of the natural right of lawful defense (?)

But there were a number that had a certain resonance and relevance.  These include:

  • Policies cannot operate without the sanction and support of a dominating force; this force must be entrusted to those who make the policies.  (Meaning: policies are useful if they are understood to be by the authority of top management.)

  • It ought to be stated that the purpose of policies is to prevent injustice from reigning.  (Meaning: the reason we write policies is to ensure consistency and prevent certain unfairnesses.)

  • Policy is the common force organized to act as an obstacle to injustice. In short, policy is justice. (See above.)

  • In this matter of education, policy has only two alternatives: It can permit this transaction of teaching-and-learning to operate freely and without the use of force, or it can force human wills in this matter… (Meaning: we should use policy as an educational approach for training.)

Additionally there was one that I found particularly interesting:

“Policies … have acted in direct opposition to their own purpose”
It is not uncommon for folks to have a bad event, maybe a problem with a customer complaint or a lost sample.  In the process of remediating the problem and then starting a corrective action, there is a moment where someone decides that the laboratory needs a new policy, like “samples collected on the weekend shall only be collected by an on-call technologist”.  That solved the problem, until folks everywhere started seeing an easy way to have samples collected at the laboratory’s on-call expense.  All of a sudden the policy has created a whole new problem, one which would have been averted if someone had taken a moment to think it through before announcing a new policy. 

All this being said, I acknowledge that this is a bit of a “geek game”, playing with words and creating new interpretations.  Not everything is a pearl.  But taken in the spirit in which it was intended, there are some grains of usable message in the exercise.

Friday, December 2, 2011

Cycle Gipe


I used to worry about creating neologisms (new words) because in psychiatry it is said to be a symptom of psychotic behaviour.  But I decided what-the-heck.  English has always been a language in transition.  Besides this is not so much a new word as maybe a new acronym, and we know that English is inundated with new acronyms every hour.  So with that let me introduce my new phrase: cycle gipe.

Walter Shewhart got it right in 1939 when he created the notion of a series of events that follow a regular pattern: first you Plan, then you Do, then you Check the results, and then you fix through Acting; and then you do it again and again.  W. Edwards Deming understood it and modified it slightly, but the concept remained the same.  We can call these good examples of a continuum of activity, but since the point was made that the process should be one of continual repetition, these are good examples of what we can appropriately call cycles. 

But sometimes things don’t work out just that way.  Not only does the process sometimes fail to go through the repetition (hence not a cycle); all too often it barely makes it through the first stage of the continuum.  So do we have a word for that?
Well we do.  Impolitely one might think of another acronym (fubar), but wanting to be a little more positive, I thought about NCC, for non-continued continuum.  But I decided to be even more positive with "cycle gipe" (a cycle with Good Intention but Poor Execution). 

I will use it in a sentence.  Our laboratory’s continual improvement program was cycle giped when we stopped doing our internal audits.    

Here are a few additional examples of cycle gipes.

The intra-laboratory continuum failure
A laboratory needs to create and provide and update job descriptions, because the job description tells workers what you want them to do and describes their reporting process.  This document can become the employee’s training manifest and their training compliance record, and then form the basis for their competency assessment manifest.    Rather than being a one-size-fits-all document, it is a record personalized for each employee.  Rather than being written as a one-time document, it is a living document that undergoes continual revision and repeats through a cycle process. 
I know it sounds like a lot of work, but compared to all the time lost to work confusion, it actually should save time.  And rather than having one part in HR, another with the supervisor, and another in the quality records file, it becomes the one fit-for-purpose document, which again saves time and confusion.

So why does this fall into cycle gipe?  I suspect the main reasons are tradition and distraction.  Historically, job descriptions have always resided in the Human Resources Department, and training records live somewhere else.  And even if we do update one part, we usually don’t get around to updating the others.   Or worse, one person updates one part, and someone else updates another.  So even if we start off with a unified document, given a few months all the parts get changed.  A good example of “Good Intention but Poor Execution”.

The answer here is simple.   The document is created in one place and is maintained centrally.  All the departments have access to it, but only one person controls it.  It becomes a single document under document control. 

The control is greater, the utilization is uniform and the outcome is better, but the net work involved is less. 

The Cross-System Continuum Failure.

This is a more insidious failure, with much broader implications, and it is much harder to control because the responsibility for monitoring success or failure does not lie in the hands of the people responsible for creating the process in the first place. 

For the best of intentions, a standards development organization (SDO) creates a document that describes from first principles a policy statement.  Let us take, for sake of argument and example, a policy about patient identification.  The policy makes sense and takes into consideration certain basic principles.  If undertaken as written and intended, it should prevent or reduce many laboratory errors.  This would be a good thing.  An accreditation body reads the policy statement, it sounds good, and adopts it as a requirement in their accreditation standard. 

So it comes down to the laboratory with an expectation that certain activities will occur.  The laboratory reads the requirement and implements it as policy and instructs that it should be put into action.  So far all is good and the continuum has worked as intended. 

The problem occurs at the next level: put in the hands of the front-line worker, the good idea doesn’t work as intended.  Maybe it is a language issue or a cultural issue or a work pattern issue.  Whatever the reason, the process is not implemented.   That should be picked up as a non-conformity through internal audits or competency assessment, but that only works if there is an audit process in place.  And all too often there is not.  So there is no record that implementation has not succeeded.  But that should be OK, because the accreditation team should pick up the absence of a record; but that doesn’t happen either.  And monitoring live activity is not something that is done effectively during accreditation assessments.   So the SDO never finds out that the policy is not working, and so with each new version of their document it keeps on getting repeated. 

Ergo, we have another good idea and good intention but poor execution. 
So how often does that happen?   Must be pretty rare, right? 


So here is the message.  Over and over we see that Shewhart and Deming were right.  Continuums and cycles work when we go through a step-by-step process.  But they all have the opportunity to fail when we forget or neglect to go through all the steps.   
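As a loose, hypothetical illustration of the point (the names and structure are mine, not a quality-system tool): a cycle only counts as a cycle if every step runs on every pass.

```python
def run_cycle(steps, iterations):
    """Run every named step, in order, on every iteration.
    A 'cycle gipe' is what you get when the later steps
    (Study, Act) silently drop out of the loop."""
    log = []
    for i in range(iterations):
        for name, step in steps:
            step()
            log.append((i, name))
    return log

noop = lambda: None  # stand-in for real Plan/Do/Study/Act work
log = run_cycle([("Plan", noop), ("Do", noop), ("Study", noop), ("Act", noop)], 2)
print(len(log))  # 8: four steps x two iterations, with no step skipped
```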

Plan - Do - Study - Act.