Tuesday, April 26, 2016

When the VoC is screaming

Today I received an on-line satisfaction survey from a company that I use regularly.  They are a well-informed “partner” of ours, in the sense that we use their product on a regular basis, and the quality of our program is in part linked to the quality of their product.  That is exactly the type of company to which I will respond.  

These days, every time you go to a store, stay at a hotel, or travel, you get another survey request. I have found it best to ignore most of them, because responding just leads to more surveys. So I, like I am sure most of you, limit the number of satisfaction surveys that I complete. I suspect that many of you go one step further and delete them all.

So I open up the survey, and the first page has 3 straightforward, single-answer, demographic-type questions designed to get some basic information but nothing that could be construed as personal or invasive. A good start. The second and third pages are a little longer, but clear in intent, with a request for multiple answers where appropriate. This is followed by another 4 pages with single questions only.

And then we hit page 8, and all of a sudden things change. Now we have 2 grids, each with 5 rows and 6 columns, with a complex set of instructions. Further, both grids are marked as mandatory: they must be completed in order to proceed. I look to the top of the page to see how far I am into the survey, to make a guesstimate of how much more time I will have to commit, but no progress guide is provided.

And my attitude towards this survey starts to change, in a real hurry. I have three choices. I can quit and not bother any longer, I can continue on row-by-row, or I can send a message.

And I decide that this is message time. So I fill in the two grids at random, just so that I can get to the next page, of which I ultimately discover there are 5 more, all of which get the same treatment.

On the second-to-last page I find a text box, in which I comment, “send out crap surveys, get crap information”. (I found the text box only after reaching the last page and looking for the “Many thanks. Please leave any additional comments here” box, which did not exist, then hitting the “previous” button until an available text box appeared.)

Maybe they will read my comment; probably they will not. What is more likely is that some analyst looks at the collective compiled data, does a bunch of cross-cuts, and creates a report which someone reads (or not), and the world goes on.

The problem with this is that Quality Managed companies actually do need to be able to get information from their customers in order to find out whether they are meeting their needs and requirements. It is as Crosbyesque as you can get.

You can get information in a number of ways, one of which is on-line surveys.  There are a lot of other choices, but if done well and carefully, on-line surveys provide a combination of immediacy and directness and distribution that few other choices can provide.  

So if you are going to send out surveys, the least you should do is create them in a manner that optimizes the opportunity for good information and reduces the opportunity for bad information.

In previous writings [http://www.medicallaboratoryquality.com/2011/06/satisfaction.html], I made a number of recommendations to increase your potential for good information, which include:

a.    Focus the survey on a single issue.
b.    Limit the survey to only a few questions; 5-6 is best, and NEVER more than 10.
c.    Make the questions as uncomplicated as possible.
d.    Pre-test the questions to reduce (you can never eliminate) ambiguity.
e.    Make sure that the survey can always be completed in 3 minutes or less.
f.    Never require an answer. That is a guaranteed invitation to bogus information.
g.    Decide in advance which slice of your audience you are interested in, and then focus your energy only on that group. General send-outs are a total waste of time.

Now, some 5 years later, these still seem to be very useful rules to live by, although I would now raise the recommended limit to 9-10 questions, and never more than 12, provided that you don’t break the 3-minute rule.
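For teams that build their surveys programmatically, the rules above can even be enforced before a survey goes out. The sketch below is purely illustrative, assuming a hypothetical in-house survey structure rather than any real survey tool’s API; it checks a draft against the question limit, the no-required-answers rule, and the 3-minute rule.

```python
# Hypothetical pre-flight "lint" for a draft survey, based on the rules above.
# The Survey/Question structures are illustrative assumptions, not a real API.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    required: bool = False  # rule f: never require an answer

@dataclass
class Survey:
    topic: str                                   # rule a: one issue per survey
    questions: list = field(default_factory=list)
    est_seconds: int = 0                         # estimate from pre-testing (rule d)

def lint_survey(survey: Survey) -> list:
    """Return a list of rule violations; an empty list means the draft passes."""
    problems = []
    if len(survey.questions) > 12:               # updated limit: never more than 12
        problems.append("too many questions (max 12)")
    if any(q.required for q in survey.questions):
        problems.append("required answers invite bogus information")
    if survey.est_seconds > 180:                 # rule e: 3 minutes or less
        problems.append("survey takes longer than 3 minutes")
    return problems

draft = Survey("course feedback",
               [Question("How clear was the material?", required=True)],
               est_seconds=120)
print(lint_survey(draft))  # -> ['required answers invite bogus information']
```

A check like this is no substitute for pre-testing with real respondents, but it catches the mechanical failures (mandatory grids, bloated question counts) before anyone hits “send”.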

In our programs, we continue to rely heavily on our surveys for information from clients, course participants, and organization members.

There is no doubt that, when performed properly, on-line surveys are extremely useful. When performed improperly, they are not only a waste of time and energy; they result in poor information and potentially poor decision-making.

1 comment:

  1. We normally send surveys in hardcopy to our different clientele. It consumes a lot of resources, and it is difficult to get the filled forms back unless you kind of force them to complete. Of course that defeats objectivity and might misinform us. Online surveys offer flexibility and convenience.


Comments, thoughts...