Why (And How) Marketers Should Look Beyond SaaS Dashboards

[Image: SaaS dashboard data overload. Photo credit: Ian E. Abbott via Visual hunt / CC BY-NC-SA]

With more than 3,500 marketing SaaS providers in the market today, marketers seem spoilt for choice when it comes to sophisticated technology they can rely on for insight.

Every SaaS service, from marketing automation to A/B testing, and content personalisation to analytics, claims to help us make better-informed decisions and become more customer-centric.

But despite the claims of marketing SaaS providers, their dashboards and reports don’t provide the actionable insight that marketers need to truly understand their customers’ behaviours. They provide metrics (lots of metrics) but precious little that can be acted on with a high degree of certainty. Marketers still need to infer (a polite word for “guessing”) meaning from this data, in an effort to understand their customers.

[Image: MarTech landscape evolution (source: Marketing Land)]

5 reasons relying solely on dashboards doesn’t help you better understand customers

1.) Dashboard data doesn’t answer “Why?”

Dashboard data does not answer one of the most important questions – “Why?” It tells you “How many?” “When?” “Where?” and “On which device?” But it can’t provide the important answers that marketers need to replicate success — or avoid repeating mistakes.

Examples of common questions marketers need to answer are:

Why did subject line A outperform subject line B?

Why do so many people bounce from this landing page?

Why are mobile users not converting?

In 2016, it’s not enough to know what happened – you must also know why. This vacuum tends to get filled with guesswork and pointless debates. For example:

“I think mobile users don’t convert because [choose your favourite]:

…they never will, they’re just browsing.

…we need to make search more prominent.

…we need to make special offers more prominent.

…we need more delivery options.

…we need fewer delivery options.”

The dashboard shows there’s a problem… but it does not answer why.

2.) A/B testing is guessing

Most A/B tests are inconclusive, with perhaps as few as 1 in 8 leading to any improvement.

Who’d argue against using A/B testing tools to drive conversions? Not me. But using the results of A/B tests alone to truly understand customers can be dangerous – and may completely bypass what really matters to your audience.

A/B tests should start with a hypothesis. But seeing results from an A/B test doesn’t mean your hypothesis has been proven (or disproven) – or even that it was the best hypothesis you could have come up with in the first place.

For example, if variant B beats variant A in a test, it doesn’t mean you’ve gained a deeper or even useful understanding of your customers. You have proven that one design element or copy change has outperformed another. But that’s about it.
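To make that concrete, here’s a minimal sketch (the conversion numbers are invented purely for illustration) of a standard two-proportion z-test – the kind of sanity check a “winning” variant still has to pass before you can even trust the uplift itself, let alone the hypothesis behind it:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns both conversion rates and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # normal approximation
    return p_a, p_b, p_value

# Invented numbers: variant B "beats" A on raw conversion rate...
p_a, p_b, p = ab_significance(conv_a=48, n_a=1000, conv_b=59, n_b=1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  p-value: {p:.2f}")
# ...but a p-value of ~0.27 means the uplift could easily be noise: the test is inconclusive.
```

And even when the numbers do pass, all you’ve confirmed is which variant won – not why.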

More worryingly, what if all your hypotheses increase conversion (they never do, but stick with this), yet you still miss the key improvements that could take you beyond incremental conversion uplift? There’s nothing wrong with the hypotheses or experiments themselves – but what if more impactful ones exist?

A/B testing provides a great ROI (at least to begin with) but should not be relied upon to improve your understanding of customers. And results can soon plateau if hypotheses are merely best guesses at what matters to your audience.

[Image: A/B testing as guessing. Photo credit: Word Games – Hangman]

3.) Net Promoter Score (NPS) is not all encompassing

Net Promoter Score (NPS) is now widely used in the boardroom as “…the foundation of a measurement framework that is tightly tied to the customer journey.” (source: Satmetrix) But this single metric can mask underlying problems across digital channels that marketers need to understand, and act upon.
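For reference, the score itself is a very simple calculation: respondents answer “How likely are you to recommend us?” on a 0–10 scale, and NPS is the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch in Python, using invented survey scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6), on a -100 to +100 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Invented responses to "How likely are you to recommend us?" (0-10)
print(nps([10, 9, 9, 8, 10, 7, 9, 10, 6, 9]))  # 60 - a "high" score...
# ...calculated only from people who answered the survey, i.e. existing customers.
```

That simplicity is the appeal – and also why a single number can hide everything that happens before (and around) the purchase.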

Striving for a single overarching CX metric is admirable (keep it simple and measurable, stupid). But there are two key weaknesses that marketers need to consider.

Firstly, you can have a very high Net Promoter Score but a terrible digital experience. In one of many real-world examples, low-cost airline websites perform very poorly but achieve a very high NPS because of price alone. Maybe that’s fine at a very high level. But as a marketer needing to improve metrics and demonstrate ROI, it can be very restricting if you’re trying to secure budget for a new website (that won’t impact NPS).

Secondly, only customers (those who have already bought) complete NPS surveys. It doesn’t account for those who never became customers in the first place – namely the website visitors who bounced within seconds of landing on a page. This in-built bias means NPS masks what’s really happening under the hood of your acquisition activities.

Of course, no marketer would rely on NPS alone as a measure of their performance. But its limitations need to be more widely understood (particularly at board level), if companies want to succeed digitally.


4.) Where are the people?

Metrics and dashboards don’t humanise the people (customers) that use products or services. Marketers turn to personas as a means of grouping different customer types and to remind themselves “there are real people behind this” – but personas aren’t actual real people.

Research has shown that greater exposure to customers improves the performance of digital teams because they:

  1. Become more motivated to help real people (customers) by empathising with them
  2. Are solving real problems, not just those inferred (or guessed at) from data

A graph on a dashboard, even if it’s dramatically up or down, is unlikely to motivate a team as much as the process of solving real problems for real people.

5.) Marketers’ competitor blind-spot

SaaS dashboards tell you next to nothing about your competitors. Yes, there’s sometimes sectoral benchmarking, but it provides no actionable insight.

Competitors are a data blind-spot – meaning marketers can’t rely on their dashboards to answer the most fundamental of questions:

Why do people buy from my competitors?

Without insight to answer this, marketers must guess (a less polite word for “infer”) and hope that they hit on one or more of these reasons.

[Image: UX testing and competitor benchmarking – spy on competitors, Naruto style!]

3 ways marketers get the answers that their dashboards don’t provide

Here are techniques that marketers are using to get the customer insight they need to make informed decisions, rather than rely on hunches.

1.) Observe customers

Simply watch real people use a website or app, or find their way from a Google search, and listen to their spoken thoughts as they do so. Watching only a handful of people can reveal the meaning behind your metrics, answering the “why” questions that dashboards alone can’t.

Real-life example: Lovehoney

Lovehoney, the UK’s largest online retailer of adult toys, uses UX testing to discover what needs improving on its site, and to validate A/B testing hypotheses.

Tests run include competitor benchmarking, full online user journeys (from search to purchase) and multi-device testing.

[Video: Lovehoney – this user is “really, really” annoyed that the postcode finder won’t find his address]

By targeting areas customers actually struggled with, rather than guessing what to optimise, Lovehoney enjoyed improvements at multiple stages of the customer journey. As a result, the business has seen a 115% revenue increase, and a 24% uplift in conversion rates.

Whether you call it usability testing, user research or UX testing, it’s the proven way to capture the customer insight that’s missing from metrics alone.

2.) Talk to Support Staff

Often overlooked, support staff hold important answers like:

  • What you should be A/B testing
  • Why your latest email campaign failed
  • Why some customers are switching to competitors

Mining the data from support calls can prove invaluable. But simply talking with front-line staff to get a sense of trends and current issues is a great place to start.
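If you do want to go beyond conversations, even a crude count of ticket topics can surface trends. A minimal sketch, assuming a hypothetical CSV export of support tickets with a “topic” column (the file name and column are purely illustrative):

```python
import csv
from collections import Counter

def top_support_topics(path, n=5):
    """Count the most frequent topic tags in a support-ticket CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        topics = (row["topic"].strip().lower() for row in csv.DictReader(f))
        return Counter(topics).most_common(n)

# Hypothetical export - output might look like:
# [('postcode finder fails', 42), ('delivery options unclear', 31), ...]
print(top_support_topics("support_tickets.csv"))
```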

3.) Set a team “customer exposure” KPI

Experts agree that time spent observing customers is the closest thing to a silver bullet when it comes to gathering the insight needed to understand their behaviour. Many marketers are now setting themselves and their teams customer-exposure goals, as a metric.

For some, this is a set number of hours each month. For others, it’s a countdown of “time since last customer exposure”, which should never lapse past a certain number of days.
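If you do track it formally, the countdown version is trivial to automate. A minimal sketch, with made-up names, dates and an assumed 30-day threshold:

```python
from datetime import date

# Made-up log of when each team member last observed a real customer
last_exposure = {"Anna": date(2016, 5, 3), "Ben": date(2016, 3, 28), "Chloe": date(2016, 5, 20)}
MAX_DAYS = 30            # assumed threshold: nobody should go longer than this without exposure
today = date(2016, 5, 25)

for person, last in last_exposure.items():
    days_since = (today - last).days
    flag = "OK" if days_since <= MAX_DAYS else "OVERDUE"
    print(f"{person}: {days_since} days since last customer exposure [{flag}]")
```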

It doesn’t need to be that formal – plenty of marketers hold informal but regular events to cajole their teams and colleagues into spending time with customers. For example, Pizza Fridays in the boardroom, spent watching videos of customers using their website.

Real-life example: lowcosttravelgroup

Hoteling.com, a part of the lowcosttravelgroup, is an eCommerce business that helps people easily find the hotel they want, at a price they like, on any device.

The company ran a round of international UX testing which uncovered 40+ region-specific bugs. A cross-functional team then reviewed the user testing videos, over pizza and beer on a Friday.

[Video: Hoteling – this user has difficulty finding their destination in the search results]

The meeting, which included the product director and product owner, was used to prioritise bug fixing and help connect the development team’s work with actual users.

Data is good – insight is best.

Your smartest business move is UX testing.

Try it for yourself – get a free trial showing 3 real people using your website or app, as they speak their thoughts.

Lee Duddell
Lee Duddell is the founder of WhatUsersDo.

During 20+ years of working in digital, Lee became increasingly frustrated with the amateurish way that companies were making important design decisions. Personal opinions, hunches and incomplete data were driving experience design – not user insight.

Lee started WhatUsersDo to fix this by making user research and UX Testing business as usual.
