Thursday, January 12, 2012
Proficiency Testing and Gram Stains.
I have a recurrent dilemma. I know there is a best answer but I wish I were more confident in sorting it out.
Setting: In our microbiology proficiency testing program we create samples that look like and act like true clinical samples. One of the samples we create is a glass slide smear for microscopic analysis using the Gram stain. (For non-microbiology folks, this is a six-step procedure using three principal reagents that results in clear visualization of bacteria, yeast, and host materials including cells and mucus. It is a classical component of infectious disease diagnosis.) Because we make the samples from their constituent components, we know exactly what is in them, and by extension what is not. We then confirm those results through repeated quality control.
In some samples we will add bacteria and in others we will not. Sometimes we add neutrophils, sometimes lymphocytes, sometimes squamous epithelial cells, and sometimes a mix. Generally, neutrophils signify acute inflammation, lymphocytes signify chronic inflammation, and squamous epithelial cells signify inclusion of surface cells, sometimes associated with contamination.
Challenge: Nearly every time we create a slide with neutrophils and no other cells, there will be some folks who report either a mix of neutrophils and epithelial cells or, more rarely, epithelial cells alone.
Why this is a problem: Neutrophils do not look like epithelial cells, and epithelial cells do not look like neutrophils. So reporting the presence of epithelial cells is clearly a reading, interpretation, and reporting error. Some on the committee believe this is a serious problem and should be pointed out as an opportunity for improvement and remediation. I can certainly support that.
On the other hand: Others point out that while reporting the presence of epithelial cells is clearly wrong, it is not a big deal, because a clinician who receives the report is unlikely to consider this an important aspect of the slide report; as such, other than pointing out the discrepancy, there should be no penalty. I kind of understand that too.
But again on the other other hand: an important role of proficiency testing is to pick up competency errors and promote improvement. And I support that too.
So the tension is between technical competency on the one side and the consequence of error on the other. Without going too far into details, I can say that this is not a technical error seen only in small remote laboratories; indeed, it is reported across the whole spectrum of size and complexity. For some of the laboratories it is a repetitious reporting style, while for others it is a one-off event. I suspect that the error comes from both inexperienced readers, who may take their time and still have interpretation problems, and the very experienced, who do quick reads and quick reports.
On my side it comes down to an issue of principle: is Quality about being RIGHT even when the error is of trivial consequence (that sounds a lot like Henry Ford's "Quality is doing things right, even when no one is looking"), or is Quality about being RELEVANT, and accepting, or at least tolerating, minor discrepancy? (It drives me crazy when people require track changes to be displayed to demonstrate that people have not made cross-outs.)
I guess in my mind I don't think that microscopists who mistake these two cell types are likely to read benign cells as malignant, or vice versa. But I could be wrong.
Personally, I am really torn.