How to run user experience (UX) tests on marketing

Here are three fundamental ingredients for getting head-spinning results from your remote UX tests:


  • Goals – which measurable objective are you trying to achieve, e.g. reducing PPC spend while increasing sales? The nature of your goal determines everything else.
  • Tasks – a user has to interact with your marketing for an experience to exist. Tasks allow you to direct (**without controlling**) the nature of this interaction, so you get the types of insights you’re looking for.
  • Audience – who do you plan to influence with your marketing? You should run your tests using people from the same demographic.

To help explain these elements in greater detail, let’s assume you’re trying to improve the user experience of a fashion digital magazine.

If you already know this stuff and wanna check out other parts of this crash course on UX testing for marketers, simply use these links:

  1. Beginner’s guide to UX testing for marketing 
  2. How to run UX tests on marketing (p.s. you’re here)
  3. When to start UX testing for marketing 
  4. Making a marketing business case for UX testing

Which goal are you trying to achieve?

When setting goals, avoid vague statements that are difficult to quantify.

Poorly defined goal: “We want to make the design of our digital magazine more inspirational.”

This goal hinges on abstract and subjective ideas, so it can’t be measured against a tangible outcome.

Well-defined goal: “We want the design of our digital magazine to be so intuitive and pleasant that more people read all of it and sign up at the end.”

This goal is tied to measurable outcomes – average read-through and sign-ups. So, if you’re running UX tests with these outcomes in mind, you can optimise other elements to help you achieve them.

What’s important when setting tasks for users?

The key thing to remember is you want to simulate for users, as closely as possible, the feeling of making decisions and acting naturally.

Set a scenario for users to follow, don’t just dump them into your tasks

 

Poor scene setting: “Go to XXXX magazine and find a summer jacket to buy.”

This gives users little or no room to think naturally, and is prescriptive about how they should behave.

Great scene setting: “It’s almost summertime and you’re looking for a magazine that gives ideas on the latest beach styles…”

This puts users in a realistic situation and gives them room to act naturally, making their own decisions.

Avoid asking users survey-style questions (tempting as it may be)

Otherwise you’ll get inaccurate data. What people say they’ll do and what they actually do (in the moment) tend to be very different.

Avoid these kinds of survey-style questions: “Would you buy this pink jacket from XXXX website?”

Instead, draft tasks that put your users in a buying frame of mind (as explained in point no. 1), then see whether users find anything they like and try to buy.

 

We know it’s convenient to be able to say, “10 out of 15 people said they’d buy a jacket from our new site.”

But such statements are mere conjecture – they sound conclusive but are wildly inaccurate. They don’t tell you whether any issues or flaws would, in reality, stop users from buying, once they actually tried to do so.

Avoid “leading questions” – which are built on assumptions wrongly made on behalf of the user

You’ll only force the user to confirm what you desperately want to be true.

Avoid these kinds of “leading” questions: “On a scale of 1 to 10, how pretty would you say the photos in this magazine are?”

This presumes the user already thinks the photos are pretty (or even notices them), and that the only remaining question is in relation to the degree of prettiness.

Instead, set a scenario that involves interacting with the digital magazine’s pictures. Then leave it to the user to comment on the pictures (if at all).

Who’s your audience?

You can get your assets tested by the same types of people you’re trying to appeal to.

Our panel of testers has over 30,000 people across the UK, USA, France, Germany and Netherlands. You can find everyone from millennial musicians to postgraduate professors.

To filter out people who don’t fit your requirements, you can ask users pre-qualifying questions (PQQs). Just use the “Advanced profiling” feature when launching your test. You can set the requirements so that only people who select certain answers can test for you.

Here are helpful gems to remember when setting prequalifying questions:


  1. Stick to a maximum of 5–6 multiple-choice options – and only one correct answer. If you offer too many choices, you can “over-filter” and end up with too few eligible users (or none).
  2. PQQs aren’t a test, but don’t make it easy for testers to guess what kind of user you’re looking for. Be as neutral as possible in your language; otherwise, people may click through just because they want to take the test and get paid.
  3. On the flipside, don’t make the phrasing of your question cryptic or difficult to understand. Some peeps try to get too clever about PQQs but to be honest, you just need to be straightforward and neutral.

Below is an example of a PQQ we used successfully during one of our own projects.

[Image: UX testing customer segments]

As you can see, we wanted only eCommerce peeps… but we didn’t make it obvious!

You can also use our Private Panel option, where you invite your actual customers to test your marketing.

Ready to take UX testing for a test drive? Click the big purple button below to get started.

Your smartest business move is UX testing.

Try it for yourself – get a free trial showing three real people using your website or app as they speak their thoughts aloud.

Timi is a London-based copywriter and full-time marketing sceptic – there are now more unvalidated opinions out there than ever.

He became a UX testing enthusiast after seeing its power while working at TUI – the world’s largest travel, leisure and tourism company. He then joined WhatUsersDo to sharpen his UX knowledge and work side-by-side with the field’s best and brightest.
