Monday, October 9, 2017

The most valuable question could be the one that’s not asked. (Anonymous)

And that’s why I’m not so crazy about surveys.  Sure, you can include an Other field and, yes, you can test your survey out beforehand.  Heck, you can even do some qualitative research (e.g., a focus group) upfront to help guide the creation of your survey.  But there will still be plenty of things you miss, par exemple:
  • Respondents may not be able to express their issues in words.  Believe me, “This sucks!” and “I hate this!” are not going to provide you with a lot of actionable data.
  • They may also have gotten so used to work-arounds that they might never think to bring up the issue the work-around is for.  Along those lines, they may also have no idea that something could even be better.
  • More importantly, they may simply fall prey to the limits of human memory, not remembering, for example, that terrible issue that came up last month and that they’ve managed to totally repress ever since.
  • Finally, you simply cannot assume – no matter how diligent you’ve been – that you have captured their whole universe.  Unless you are a user yourself – and your team, further, represents every persona for that user out there – there will still be plenty of things you just can’t possibly foresee.

How to get around this problem?  Well, why not go straight to the users themselves?  In particular, why not let users tell you or, even better, demonstrate how they feel, think, and behave?  That’s where in-depth interviews, usability tests, and ethnography come in.

Yes, I do typically have a list of things I want to cover when I do these kinds of studies.  What I’ve found, though, is that, if it matters to the user, they’ll be sure to let you know.  And this will come not from your asking about it but from their volunteering it – either free-form or prompted by some specific task they’re doing.

Now, if something does not come up – and my team really wants some feedback on it – I will probe.  I always tell my team, though, that in this case the bird in the hand is actually worth less than the two in the bush.  In other words, if the user brings up or uncovers something on their own, that’s going to be a lot more valuable than if I force them to come up with a response.

In fact, I usually tell my clients that there is a definite hierarchy of feedback, based on how it came about:
  1. Independently, through a task
  2. Independently, through interviewing
  3. Through probing, but only at the highest level (e.g., “How did that go?”)
  4. Through probing a little deeper (“Any thoughts on page X?”)
  5. Through direct questioning (“What did you think of field Y?”)

Note that I would never go any further than that last instance.  In other words, “Did you understand what that was for?” “Is that link prominent enough?” and “Is that the best place for that button?” are simply leading the witness, and I won’t stand for them.

Now, do surveys have a purpose?  Why, of course they do.  If it’s numbers you’re after, surveys are up there with web analytics and A/B testing.  Note, though, that these quantitative methods all lack two very important things:
  • The ability to follow up
  • The ability to get at the real reason why

And that’s why I usually recommend triangulation – quantitative and qualitative, surveys and tests, analytics and interviews …  And, believe me, that should about cover it all.