20 questions and instructions to avoid when running a UX test


Who loves telling people what to do?

Dictators and lazy people. Mainly.

Now, I know you’re probably not a dictator or a lazy person, but you’re undoubtedly somewhere on the scale. I myself am 30% dictator and 33% lazy; the rest is mainly sweat and ham.

But if you want to get some UX testing done, you’re going to have to get used to telling people what to do. You don’t want people scampering around your website getting up to god knows what, when all you wanted to test was the efficiency of your checkout page.

That’s why you need to write tasks for your UX test: a simple list of pointers that directs test participants to a particular area of your website or app, asks them to perform specific actions, and asks them to give verbal responses to specific questions.

Simple, right? Well, actually there’s an art to writing these tasks, especially for remote user testing.

You can ask too many leading questions and end up forcing a user down a cul-de-sac that merely confirms your own bias. Or you can be too vague and end up with a load of aesthetic opinions about your site (“I like the hot pink buttons”) – which is great for the ego, but tells you nothing about its usability.

As a newbie here at WhatUsersDo, I have made both of the above mistakes when running tests for my own training. In fact, I would go so far as to say that I am rubbish at them.

So I asked our fabulous UX research team and the equally fabulous Customer Marketing Manager, Clare Burroughs, for a little help, and they provided me with a collection of genuine examples of user tests that couldn’t be completed because of the way the tasks were written.

They also offered me the following explainer video on how to write a useful WhatUsersDo UX test…


How to write a WhatUsersDo test (for beginners)

… and pointed me towards these 8 tips for writing an incredible UX test.

But hey, when they said “we’ll find you some examples of rubbish tests”, that’s all I really cared about.

So without further ado, please avoid the following when writing your own UX tests…

Not paying attention to the order of your tasks

It’s worth double-checking that you’ve written your task outline in the correct order. This may sound obvious, but it’s easier to screw up than you think, especially if you suddenly begin adding additional pointers as you’re setting the tasks.

Look at the finished draft and check that each question follows the action it relates to. Each task you write on WhatUsersDo is broken up by drag-and-drop boxes – which is really handy, but if you’re not paying attention, you can knock them out of order.


However, one thing you should never do on purpose is write the tasks in reverse order. We’ve had this once; it made no sense, and we’re not sure why the client felt the need to do it – perhaps they love a good chart rundown. Perhaps they were Paul Gambaccini. HEY, STICK TO POP MUSIC, PAUL GAMBACCINI!

Firing a barrage of questions at your tester

Test participants are encouraged to offer a constant verbal commentary whenever they undertake a task, and you should ask specific questions that relate to the thing you’re testing to make sure you’re getting relevant feedback. But for goodness’ sake, don’t overwhelm the participant – especially all in one go, long after they have completed the task.

Here’s a hypothetical example of a barrage of questions for a product that will form the next stage of WhatUsersDo development*

  1. Before today, were you aware of the WhatUsersDo lottery?
  2. Please explain what you know about WhatUsersDo lottery?
  3. How do you take part in the WhatUsersDo lottery?
  4. How do you earn tickets in the WhatUsersDo lottery?
  5. How do you find out how many tickets you have in the next WhatUsersDo lottery draw?
  6. What is the difference between Weekly Tickets and Platinum tickets?
  7. When do the WhatUsersDo lottery draws take place?
  8. When is the next WhatUsersDo lottery draw?
  9. What prizes can be won in the WhatUsersDo lottery?
  10. Have you ever won a prize in the WhatUsersDo lottery?
  11. In general, how do you feel about the WhatUsersDo lottery?
  12. If you were the CEO of WhatUsersDo, what one thing would you change about the WhatUsersDo lottery?
  13. Is it going too far that if you fail to keep up payments, the WhatUsersDo lottery will have your legs broken?
  14. Did you notice the small print saying that the WhatUsersDo lottery now owns your first born child?
  15. What are you doing after work? The WhatUsersDo lottery has some time to kill and is very discreet

*If I had my way

Being insanely specific

Do give some context to the task (e.g. you want to book tickets to see Fast and the Furious 9 at the cinema), but don’t be so ridiculously specific that your task (and indeed product) is only relevant to one person in the world.

Here’s a real-life example that I’ve altered for this article…

“Imagine that you work in a gondola hire company as an admin. The head gondola boss person wants to be able to send out text messages to all previous customers when there is an emergency – e.g. such as the local river overflowing or it freezing over. A colleague of the head gondola boss person has mentioned that a company called iGondol offers this sort of service. You have been asked to find out more.”

I’m in too deep! I don’t know who I am any more! This is just like The Departed! Or better yet… FACE/OFF! If only my mission had simply been, “you want to find out more about a company called iGondol”, then I wouldn’t be in this existential mess.

Asking for credit card details

Even if you’re testing the checkout process of your website, you should always stop short of asking for payment details. It just isn’t necessary.

Even if you guarantee that you’ll pay a customer back after the transaction has gone through, there are still too many things that can go wrong. And bear in mind that a user will be recorded typing in their card number, and this video can and will be shared with multiple people within our organisation and your own.

Saving your single, all-encompassing question till the end of the task

You don’t have to ask any questions at all if you don’t want to – the point is to observe the behaviour that occurs when the tester tries to carry out a specific action. However, if you’re only going to ask one question, don’t save it until the very end and ask for details about the entire task, minutes after it’s been completed.

“Please summarise the best and worst parts of your experience on our website today.”

If you’ve asked the tester to complete 10 minutes’ worth of tasks, they are unlikely to remember the exact details of what they liked or disliked at the end of a session.

Instructing a tester to “close your eyes and (without peeking) say what you remember”

This is a slightly controversial one. When you ask a question about remembering something on a webpage, you’re trying to discover whether or not a user has noticed a specific feature of the page – so it may provide some useful insight. However, you can arrive at the same conclusion by merely observing their behaviour in a more natural, ‘in the moment’ manner, rather than playing a memory game.

Offering a website address and nothing more

It’s crucial to tell your participant which website link to click on in order to start their test. However, don’t leave it at that. The user won’t know what they’re meant to do, will click around aimlessly, probably won’t offer anything other than a simple commentary, and will mainly just say things like, “well, now what?” You need to be more specific with your tests and tell the user what to do or where to go.

Also remember to give them the exact URL of where you want them to start, otherwise who knows what variation of your site or landing page they’ll end up on?

Asking, “What’s your first impression?”

It’s a bit too vague – there’s no guarantee they’ll give you anything usable with this question. Try instead: “What’s the first thing you notice?” This can inform structural improvements or highlight problems with your visual hierarchy.

Being too technical

Although you probably know what is meant by a ‘hamburger menu’, a ‘mega nav’ or a ‘hover state’, try to avoid using this kind of web design language when asking participants to complete the task. Chances are they’ll be just like your regular visitors and won’t know or care about the proper UX terminology.


Using single word prompts instead of actual questions or instructions

Avoid this…

  • Homepage
  • Men’s clothes
  • Shoes
  • Purchase
  • Checkout

You’re not writing a Cormac McCarthy novel. Although if you are, you could add the following.

  • Homepage
  • Men’s clothes
  • Cannibals!
  • Shoes
  • Purchase
  • Despair
  • Checkout
  • Moral ambiguity

Asking for an ego massage

Much like the point on competitor analysis below, try to avoid anything that leads people into merely saying nice things about your website, especially in comparison to your competition. You’re not looking for a confidence builder; you’re looking to make improvements to your site based on actual, observable insight.

Most test participants won’t be shy about telling you they like something, but the most important question is how usable the thing is.

Writing a task in a different language from the one your testers are expecting

We’ve had to stop tests because they were written in a language unfamiliar to the testing panel. We have testers across the globe and offer services in a few international locations, so if you want to run a test with an international panel in a language other than English, that’s totally cool – we can arrange a translation for you. Just make sure you let us know first and budget accordingly.

Also, maybe don’t run your test through Google Translate before submitting, as that way badness and confusion lie.

Not providing enough information for competitor analysis

You can ask test participants to look at a competitor’s website or app to get some useful observations and insight. Just remember to provide the exact instructions for where the tester needs to go. It’s no good asking, “What other gondola hire apps do you have on your phone? Compare them to our own gondola hire app.” You may assume it’s an incredibly popular type of app that everyone has on their phone, but your tester might live somewhere land-locked and have no idea what the hell you’re talking about.

Asking, “What do you like/dislike about this page?”

Don’t worry about whether the participant likes your site or not, as this may just come down to an aesthetic or content-based opinion. The only thing that matters is the usability of your site.

Typos and bad punctuation

Seriousley, people wont take you srsly. Also avoid shorthand. And repetition.

Reassuring people that it’s okay to be critical

Don’t worry. They won’t hold back.

Assuming prior knowledge on behalf of the tester

There’s a strong chance that your website operates in a niche – and that’s totally cool, we’re big fans of the niche. You could say we have a niche interest in the niche. But bear in mind not everyone who visits your site will necessarily know what you do or what they’re supposedly meant to understand about you.

So for instance…

You are currently in the market for a solicitor to help you with a gondola related accident claim. After some initial research, you want to find out more about the people who would be supporting you in your claim.

  1. What types of information would you be looking for?
  2. Once you have the information you are looking for, what would be your next action?
  3. Please provide any feedback on your experience.

These questions manage to be too vague and too specific at the same time. Impressive!

Asking, “What do you think about this?”

You want to observe their actual behaviour when completing a task. Asking for an opinion isn’t a usability test.

Making a test participant point the mouse at wherever they’re looking

We had this once, where a client wanted to test an online PDF. They basically wanted a cheat version of a ‘heat map’. This is a terrible way to go about generating a heat map.

Asking, “Do you feel your contextual understanding would be enhanced if we were to improve the visual affordances of the primary navigation?”

Okay I’m shutting down the test now, thank you, goodbye, leave me alone, never ask me to do this again.

Your smartest business move is UX testing.

Try it for yourself – get a free trial showing 3 real people using your website or app as they speak their thoughts.

Main image by Mikita Amialkovič, additional props to Pete Hornsby for the final example.

Christopher Ratcliff

Christopher is the Content Marketing Manager of WhatUsersDo. He’s also the editor of wayward pop culture site Methods Unsound. He was previously the deputy editor of Econsultancy and the editor of Search Engine Watch.
