Task-writing for Usability Testing


Have you ever felt lost when writing tasks for usability testing sessions? Don’t worry, you’re not the only one. Task-writing is a skill, and it takes time and practice to hone. Although task-writing is just one part of a bigger process, it reflects your understanding (or lack of it) of the end user and the limitations of your tools and methodologies.

Here, I’m going to share some principles that may help you.

First things first

Build your house on rock, not on sand. The foundation of any UX process lies in the quality of data collected from user research, and that means screening your research participants carefully. Recruiting haphazardly is akin to shooting yourself in the foot; you are setting yourself up for failure before testing even begins.

If you’re not the one doing the recruiting, be wary when you hear things like “Just get ten people from the next department”. Draft a screening-criteria document while closely referencing your personas, share it with your team, and then use it.
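To keep screening consistent across the team, it can help to treat the criteria as data rather than prose. Here is a minimal sketch in Python; the criteria, field names, and candidates are all illustrative assumptions, not part of any standard recruiting tool:

```python
# Hypothetical screening criteria, derived from a persona: participants
# must actually use the feature under test and be of adult age.
criteria = {
    "uses_faculty_search": True,
    "min_age": 18,
}

candidates = [
    {"name": "A", "uses_faculty_search": True, "age": 24},
    {"name": "B", "uses_faculty_search": False, "age": 30},  # fails: never uses the feature
    {"name": "C", "uses_faculty_search": True, "age": 17},   # fails: under minimum age
]

def passes(candidate):
    """Return True only if the candidate meets every screening criterion."""
    return (candidate["uses_faculty_search"] == criteria["uses_faculty_search"]
            and candidate["age"] >= criteria["min_age"])

recruited = [c["name"] for c in candidates if passes(c)]
print(recruited)
```

Writing the screen down this explicitly makes it much harder for “just get ten people from the next department” to slip through unquestioned.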

Do your homework

Don’t skimp on user research. It may be tedious, dry, and even overwhelming; more often than not, we collect far more data than we can handle. There’s no magic bullet here; UX is hard work. Deal with it.

Assuming you’ve done them properly, personas, mental-modelling output, and user journeys are living documents that will guide you throughout your process. Refer to them often, and don’t hesitate to update them as you uncover more insights.

Going through the motions and churning out deliverables indiscriminately gives you junk ore, and you can’t refine junk ore into gold. Quality output demands top-grade raw material. This is what separates a skilled UX practitioner from an amateur.

Ask “why”

At the recent UXSG Meetup #19, one of the topics hosted was translating research data into insights. Most people fumble after collecting tons of data, not knowing what to do with it next. Yet it’s not too complicated to get started.

Begin by looking at your research findings. Select those with the highest impact on your business objectives (you should already know these by this stage) and ask, “why did this happen?” Form hypotheses, and test them. That’s how you come up with goals for your tasks.

Write goals, not tasks

Here’s an example to start things off:

Goal: To see if users can understand how to use the filters in the faculty search feature.

Task: Show me how you would look for professors only from the School of Computing.

Before we even consider how to word the task, notice how writing a goal gives it so much more clarity. When you ask teammates to critique your task list, goals are absolutely indispensable: they provide the context needed to compare each task with the insights generated from earlier research.

Goals also give you razor-sharp focus in the heat of usability testing sessions, where things can and will go wrong. If your participants are not as homogeneous as you would like them to be, there will be times when you need to reword a task on the fly. Goals are your anchors.

Ranking and prioritising

List all the goals and tasks you have written and rank them. Give each a weight. This helps you place them in sequences that flow well according to your test objectives. Prioritising also tells you where to dwell a little longer and probe further during testing.
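If it helps to make this concrete, here is a minimal Python sketch of keeping each goal, its task wording, and its weight together, then ordering the list. The structure, field names, and weights are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class TestTask:
    goal: str      # why this task exists (the insight you want)
    wording: str   # what you actually say to the participant
    weight: int    # higher = more important to the test objectives

tasks = [
    TestTask(
        goal="See if users understand the filters in the faculty search feature",
        wording="Show me how you would look for professors only from the School of Computing.",
        weight=3,
    ),
    TestTask(
        goal="See if users can find a way to contact the faculty office",
        wording="Show me how you would get in touch with the faculty office.",
        weight=1,
    ),
]

# Cover the highest-weight tasks first, so they are done even if a session runs short.
for task in sorted(tasks, key=lambda t: t.weight, reverse=True):
    print(f"[{task.weight}] {task.wording}")
```

Keeping the goal attached to each task also pays off mid-session: when you reword a task on the fly, the goal is right there to anchor you.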

Lastly, be flexible

In the martial arts world, aliveness or alive training describes training methods that are spontaneous, unscripted, and dynamic. The judo practice of randori is one such method because of its unpredictability. Judo students practice with many different partners, and because each opponent is different, they must adjust their style, timing, speed, and technique to each one.

Of course, we shouldn’t end up with ten differently worded versions of a task. The point is to acknowledge and accept the reality that things don’t always go according to plan. If you’re in the business of developing software for older folks, you may have experienced “grappling” with test participants as they struggle to understand your meticulously worded tasks. Learn to navigate around your test script as you work through testing sessions, and tweak it whenever necessary.