Friday, November 8, 2019

Garbage in, garbage out. (anonymous)

Now, I know this one applies to computing in general. It wouldn’t be much of a stretch to apply it to user research, though, would it?

I mean, if you don’t get the right users, you don’t get the right data, right? The same goes for a crappy test script or a poor prototype.

And speaking of that last bit, I have noticed a huge difference over the years in the quality of the prototypes I put in front of my users. Now, is that because the skills of interaction designers are slowly eroding somehow? Actually, that’s not the case at all. In fact, I’d say those have been steadily improving.

In this case, what seems to be slowly eroding is the quality of the tools they have to work with. Hard to believe we might be going backward in that regard, but there is no doubt in my mind that InVision, the tool du jour, is a far cry from the prototyping tools I used in the past, ones like Axure or iRise. Yeah, those weren’t that easy to use, but they sure did give me nice prototypes. InVision? To me at least, it seems maybe a notch above PowerPoint. Honestly, as it stands now, users can’t even type in data entry fields! Try getting realistic feedback with that!

To tell you the truth, it’s the same with some research tools as well. Like everyone else, my company is using UserTesting. For setting up a moderated test, it works like a charm. There are some serious issues with unmoderated tests, though. For one, I can’t vary task order. So, unless my test is a single task, I’m missing something that’s been basic to usability testing since the very beginning. There are plenty of other issues, but to me, not being able to control for order effects is a show-stopper right there.
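Counterbalancing task order isn’t hard to do by hand, which makes its absence all the more frustrating. For the curious, here’s a rough sketch of what I mean by controlling for order effects: a balanced Latin square that rotates task order across participants. The task names and participant count below are made up purely for illustration, and the code doesn’t hook into UserTesting or anything else – it just shows the idea.

    # Sketch: counterbalancing task order with a balanced Latin square.
    # Task names and participant count are illustrative only; nothing here
    # talks to UserTesting or any other tool.

    def balanced_latin_square(n):
        """Return n orderings of n conditions (numbered 0..n-1).
        For even n, every condition appears in every position and
        immediately follows every other condition exactly once; for odd n,
        the reversed rows would also be needed for full balance."""
        rows = []
        for i in range(n):
            row = []
            for k in range(n):
                if k % 2 == 0:
                    row.append((i + n - k // 2) % n)
                else:
                    row.append((i + (k + 1) // 2) % n)
            rows.append(row)
        return rows

    # Hypothetical tasks for an e-commerce prototype.
    tasks = ["Find a product", "Add it to the cart", "Check out", "Track the order"]
    orders = balanced_latin_square(len(tasks))

    # Cycle participants through the orderings.
    for participant in range(8):
        order = orders[participant % len(orders)]
        print(f"P{participant + 1}: " + " -> ".join(tasks[t] for t in order))

With four tasks, that gives four orderings in which every task shows up in every position and every task immediately follows every other task exactly once – which is exactly the kind of control an unmoderated tool could hand you for free.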

So, what’s the problem here? What is going on? I blame MVPs, minimum viable products. The going model these days seems to be not in making a good product per se, but in getting something out there, capturing market share, and making yourself the only game in town.

All the other stuff that might make your product truly useful and superior? Well, I guess you can take care of that when you get around to it.


Though the saying is sometimes attributed to IBMers George Fuechsel and William D. Mellin and dated to 1960 or so, Atlas Obscura thinks it goes back even further.

Friday, November 1, 2019

The computer can’t tell you the emotional story. It can give you the exact mathematical design, but what’s missing is the eyebrows. (Frank Zappa)

I’ll bet Frank never thought this quote would lead into a discussion of moderated vs. unmoderated usability testing. Sure enough, though, that’s what I thought about when I saw this one.

Now, I know some of these unmoderated tools do show you the user (and their eyebrows). The particular one I use, however, does not.

But even if I could see those eyebrows, there’s an even bigger part of an unmoderated test that’s missing. And that’s … me, the moderator.

Having run several thousand moderated usability tests, I know that what I do is a little more than just sitting there. Now, part of what I do is fairly canned – prep, scenario descriptions, post-task check-ins (“So, how did it go?”), post-test check-ins (“So, overall, how did it go?”) …

I do, however, add some value outside all that. What if the user isn’t talking? What if the user is a bit vague or unclear? How do I probe or follow up? What if they don’t understand the task? What if they go off track? What if the user never gave us feedback on something we wanted them to? How do I reveal the correct answer when the user got it wrong? What if they don’t engage with the system fully? What if the prototype is a little sketchy? What if things aren’t totally linear and straightforward? What if something goes wrong on the technical side? What if, what if? 

Yeah, I know that unmoderated tests are fast, cheap, easy, and – at this point – ubiquitous as well. They’re not, however, for everyone and everything. They’re great for production systems and, when it comes to prototypes, for single screens or very linear flows. For anything more complex, though, they’re a bit of a gamble.

I know the world is heading – at great speed – toward faster and more automated. Now, that’s all fine and good. I do worry, though, that there might still be times when we need those eyebrows.


Frank Zappa, taking a break during a heuristic review of some music software