Tuesday, April 18, 2017

Whereof one cannot speak, thereof one should be silent. (Ludwig Wittgenstein)

How tempting it can be though.

Imagine you’re in a report-out, and your client is eating up everything you have to say. You’ve really got them in the palm of your hand. Why not just share that pet theory you had? Now, you can really only tie it to that one user. And their quote was actually pretty non-committal at that. But, heck, why not? You’re on a roll! Share this gem, and they’ll really love you.

On the other hand, there are much less positive scenarios as well. For example, one of your clients might have a great question, but – unfortunately – you don’t really have any data. Maybe it never occurred to you to collect it, and thus it never made it into your test script. Perhaps you neglected to probe when the situation came up. Maybe you didn’t recruit enough of that particular type of user to really be able to say.

In fact, that last scenario is something I face all the time. Everyone needs to remember that we are dealing with qualitative data here – and the small number of people that qualitative research typically involves. Now, those numbers might be fine to show some usability mishap (a mislabeled button, a hidden link, a missing step), but when it comes to things that are more preferential, it can be hard to really say when all you’ve got is a couple of data points.

Another issue that sometimes comes up is when it’s just six of one, half a dozen of the other. In other words, there’s data for one side of an argument, and there’s data for the other. Now, you’ve most likely got a client with strong feelings in one direction (heck, you might even feel that way yourself). So, they’ll probably feel just a little bit cheated: “Hey, I paid you all this money. I expect to see some real results. How am I gonna make a decision now?”

Basically, all it really comes down to is how comfortable you are saying, “I don’t know.” Interestingly, though, I’ve found that that will actually generate more respect for you in the long run.


And, yes, I know I’m taking this quote totally out of context  ;^)

Wednesday, March 1, 2017

The perfect is the enemy of the good. (Voltaire)

Usability is not physics. It is not a pure science. 

Heck, even though I call myself a “usability engineer,” I know what I do is honestly pretty iffy engineering. And I should know – I put up with real engineering school for two years before calling it quits.

What usability does, however, have in common with “real” engineering is a focus on practical solutions and on real data. Now, there was a time when that data was pretty darn hard even for usability – basically, error rates and time on task. Usability engineers found, though, that that hard data was lacking an important component. That data really didn’t tell anyone why users failed, why things took so long, what project teams could do to make things better. Softer, more qualitative data, however, did.

So, you may run across clients who still insist on that hard data, especially if they have a quant background (for example, from business or marketing). In that case, you have to sell the richness of qualitative over the certainty of quantitative. And for some clients, you will definitely have to overcome the idea that qualitative data is less pure, less perfect. In those situations, I like to emphasize what we do with the data – in particular, that soft data can be a lot more actionable than hard. (It also typically eliminates the conjecture that comes when teams finish gathering their hard data and then try to interpret what it means and figure out solutions.)

A similar issue usability engineers have to deal with has a lot to do with the numbers thing. I cover that in “Not everything that counts can be counted, and not everything that can be counted counts” (which is a quote from Albert Einstein).

Finally, there is the issue of the perfect test. And I’ve talked about that before in, “The test must go on” (I’ve got Laura Klein down for that one). 

Ultimately, the final decision can come down to running an imperfect test or never getting around to running that perfect one. And we all know that there's nothing perfect about the latter.

Usability is really the art of the possible.  We do what we can.  Like I tell my clients, give me what you’ve got, and I’ll be sure to get you something of value in return.




But then again, there’s this!

Thursday, February 9, 2017

Unfortunately, it's hard to differentiate in the software between someone who wants to go on a voyage of discovery, and someone who just wants to open a file. (Peter Flynn)

Now, what’s sad here is that I can almost guarantee that your design team (or marketing partners or senior execs) will err on the side of the former. It can sometimes be very hard for them to realize that this thing they’ve worked on, thought about, and agonized over for months, if not years, is really just a means to an end, a tool that users employ with little actual regard for the tool itself. 

Unless, that is, the tool was designed for some other purpose than to help those users achieve their goals … If, for example, it was designed with someone’s portfolio in mind, or to impress the division manager, or to get recognized in some magazine or on some website. Now, this will draw some attention to your tool. Unfortunately, at least when you’re talking about your users, that will almost always be attention of the negative kind. 

In general, users want tools that don’t draw attention to themselves. To them, your UI would be best if it were totally transparent, even invisible. 

And if your UI needs lots of training, that’s even worse. Note that that includes traditional kinds of training like manuals and videos, and more up-to-date, subtle means like coach marks and what’s-new content.

Now, of course, there are certain user types who do like to go exploring. These users are often techy types, and sometimes really do want to learn the tool and develop a mastery of it. Good for them! I’m not sure we need to develop our system around them though. Perhaps if we just offered some option so that they could go on that voyage without forcing everyone else to. Maybe a link to a tutorial, maybe an expert version of the software …

The important thing, though, is to concentrate on the user and their goals, instead of on the tool. 


Peter is at University College Cork, one of the better schools for usability in Europe

Friday, January 27, 2017

Any intelligent fool can invent further complications, but it takes a genius to retain, or recapture, simplicity. (E.F. Schumacher)

At work, most people tend to get rewarded for mastering the complex. Think of some internal system that only Joe knows how to use, some bureaucratic process that only Cheryl can understand, some hardware that only Trey can fix. Honestly, I’m pretty sure it’s behind why lawyers, accountants, and engineers all make the big bucks.

Unfortunately, for us UX types, it’s just not enough. Sure, the developers can get away with mastering C++; the lawyers with Reg this and Reg that; and project management with some unwieldy, macro-infested, homegrown spreadsheet horror. For us, though, we typically have to take all that complexity and turn it into something that our users can deal with and make sense of.

Thus, we often act as translators. So, not only do we need to learn that difficult source language of technology and bureaucracy and regulation, but we also have to translate all that into the target language of our users. 

Our effort is two-fold. First, we need to master the complex. Then, we need to turn that complexity into simplicity. 

Over the years, I’ve noticed that some UXers are great with that first part, but not with the second. To me, they’ve always seemed like frustrated techies (wolves in sheep’s clothing, if you will). Subsequently, their designs can often be great for themselves – and other techies – but maybe not so much for everybody else.

On the other hand, it’s hard to be a graphic designer without mastering PhotoShop, or an IA without being an Axure wizard, or a writer without knowing your content management system inside and out. What happens when you don’t?  Well, you might very well come up with user-friendly solutions, but you might also have a hard time translating those solutions into something workable. Heck, you might not even be able to fully grasp the complexity of the problem you’re trying to solve from the get-go, leaving out important pieces and ultimately making your solution harder, not easier, to use.

Face it, UX is one of those both-sides-of-the-brain disciplines. If your brain is structured that way, you’ll get a major kick out of both understanding the complex and then turning it into something simple. If not, though, I can guarantee that at least one side of that equation is going to bug the heck out of ya.


E.F. Schumacher was an economist and statistician, 
but was also the author of Small Is Beautiful

Thursday, January 5, 2017

Instead of assuming that people are dumb, ignorant, and making mistakes, assume they are smart, doing their best, and that you lack context. (Nicholas Zakas)

Actually, I sometimes like to think that it’s the designer (or developer, or client, or HIPPO) who is dumb & ignorant. Needless to say, I also keep that strictly to myself.

Those are definitely my thoughts, though, whenever I hear someone put forth the traditional, “Where do you get these idiots from?” (or something along those lines). How I actually do respond is to point out that these are our users, that we used a 10-page screener and paid a recruiting agency $1000 to get ahold of them, and that not everyone out there is as smart and tech-savvy as you guys.

So, that usually takes care of the “smart” part. As for the “doing their best,” we sometimes do have users who are just there for the money, but that’s extremely rare. It’s usually totally obvious to anyone observing that 99 out of 100 users are taking things seriously and are genuinely engaged.

Now, as for “context” … Hopefully, the design team had some exposure to that beforehand. Personas, journey maps, and all that other great upfront research can give the team some real feel for their users – what they do and don’t know, what they like and don’t like, what their goals and fears are – and how to design something just for them.

Even if there has been that exposure, though, I try to push testing as an excellent way to get even more context. Even the best upfront research can be incomplete, or neglected, or misapplied. Testing, though, is the chance to really check things out, to get that final word. The more sophisticated teams I work with have no problems understanding that, and often see testing in this regard as simply fine-tuning.

It’s those teams who don’t do any up-front work, and who can be totally blind-sided by things that happen in the lab, that I really worry about. Hopefully, though, these teams can use that experience to learn to empathize with their users a little more – heck, maybe even do a little of that up-front research and avoid those uncomfortable situations in the first place.


Just in case you were wondering what a HIPPO is