Tuesday, February 23, 2016

Just because no one complains doesn't mean all parachutes are perfect. (Benny Hill)

Does your organization make use of voice of the customer (VOC) systems? These typically combine traditional channels for customer feedback (surveys, "give feedback" links, and so on) and display the results in one system. They usually try to automate as much as possible, using techniques like text analysis and machine learning. They also tend to feature lots of cool graphs, a snazzy dashboard, and all sorts of bells and whistles.

Personally, I think they’re wonderful. Hey, it’s all feedback, right? 

At the same time, though, I have run into a number of people over the years who seem to rely on this kind of feedback almost exclusively. Now, the various methods that make up a VOC system do have some real strengths (sheer numbers are always at the top of the list), but they have a number of drawbacks as well. 

So, what are some of those problems? I see three issues on the user’s end:

Knowing – First of all, the user needs to know that there’s a problem. In the lab, I often see users who think they have completed a task, but who actually have some steps remaining. I also sometimes see users complete another task by mistake, but be totally unaware that everything isn’t just peachy-keen. 

Another issue is work-arounds, a special problem for experienced users. They may be so used to doing things a certain way that they aren't even aware their experience has issues, let alone inclined to complain about them.

A special issue for surveys is human memory. There is often a major time lapse between when users have an experience and when they get surveyed, and the chance of their remembering specific details can be very low.

Articulating – Second, the user has to articulate the problem. Note that this is not as trivial as it may seem. Believe me, I've been doing this for 30 years, and I still struggle to figure out exactly what went wrong in some instances during a test. Is this an example of poor cognitive modeling, Fitts's Law, progressive disclosure, skeuomorphism? Now, imagine some grandma from Iowa trying to do something similar.

What you often get instead are very non-specific comments. Just as an example, it's amazing how many times over the years I've seen my particular favorite, "This sucks!" Not a lot of actionable information in that one, huh? (Just as an aside, one major strength of usability testing is that it allows follow-up on these vague responses.) 

Caring – Finally, the user has to care enough to share what they think with you. And that’s where those traditionally low response rates come from. In fact, would you believe that some companies are actually happy if they get a rate of 0.5%? Wow, how representative can that be?

So, who does fill out these things then? Well, there is typically a fair amount of self-selection going on. You might get haters, or fanboys, or the terminally cranky and hard to satisfy, or the eager to please. 
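The math behind that skepticism is easy to sketch. Here is a toy simulation (every number in it is invented for illustration, not taken from any real VOC system) showing how a roughly 0.5% self-selected response rate can badly misrepresent the user base:

```python
import random

random.seed(42)

# Hypothetical population of 100,000 users with satisfaction scores 1-5.
# Both the distribution and the response probabilities below are invented
# purely for illustration.
population = random.choices([1, 2, 3, 4, 5],
                            weights=[5, 15, 40, 30, 10],
                            k=100_000)

# Self-selection: assume the furious (1) and the delighted (5) are far
# more likely to bother responding than the indifferent middle.
response_prob = {1: 0.03, 2: 0.005, 3: 0.001, 4: 0.002, 5: 0.02}
respondents = [s for s in population if random.random() < response_prob[s]]

rate = len(respondents) / len(population)
pop_extreme = sum(1 for s in population if s in (1, 5)) / len(population)
resp_extreme = sum(1 for s in respondents if s in (1, 5)) / len(respondents)

print(f"response rate: {rate:.2%}")                    # lands near 0.5%
print(f"share of 1s and 5s in the population: {pop_extreme:.0%}")
print(f"share of 1s and 5s among respondents: {resp_extreme:.0%}")
```

Under these made-up assumptions, the extremes make up a small slice of the actual population but the large majority of the survey responses, which is exactly the haters-and-fanboys skew described above.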

And that too is another benefit of testing. Though a test almost always involves small numbers, you do know what every one of those users thinks or would do – even if they would never respond to a survey or click on give feedback.

A final issue with caring is what I would call a threshold issue. In other words, with a VOC system, you’re probably going to get a lot of highs and lows. If, however, something was not a huge disaster – or, alternatively, a life-changing experience – it’s probably not worth reporting. 

In fact, you might well run into the death-by-a-thousand-cuts syndrome. Just imagine several million users who have a couple of lower-level issues every time they log in to your system, but never enough to actually complain about. Now, imagine another similar system that comes along and doesn't have any of those low-level issues. Imagine, further, that all those users leave you for that system overnight. What would you then have in hand that would give you any idea why that happened (or – even better – that something like that might be about to happen)?

On the opposite end of the spectrum, you can get something akin to Benny Hill's parachutes. In fact, one of my all-time favorite clips came from a test I was doing on a production system. At the end of a particularly trying task, a survey popped up. If I remember correctly, the user said something along the lines of, "If they think I'm going to fill out their %#$@ survey after that @*#$% experience, they've got another &@^#$ thing coming."

In sum, VOC systems are wonderful, but they can involve their fair share of missed signals and false alarms. To make sure they are more than a glorified suggestion box, it can be helpful to triangulate their findings with other sources of data – web analytics, call center data, and ... even usability testing. 

Benny Hill, famous comedian, ladies' man, and usability guru
