Highlights from our AMA with Steve Portigal, user research expert and author

Last week on our UX community Slack channel, we hosted a live AMA with Steve Portigal, user research consultant and author of the classic book ‘Interviewing Users: How to Uncover Compelling Insights’ and, most recently, ‘Doorbells, Danger, and Dead Batteries: User Research War Stories’.


Steve is a seasoned consultant and a well-known figure in the field of user research. Over the past 20 years, he has interviewed hundreds of people, including families eating breakfast, hotel maintenance staff, architects, radiologists, home-automation enthusiasts, credit-default swap traders, and rock musicians. He’s also the host of the Dollars to Donuts podcast.

As if that’s not enough, he also gives talks and leads workshops regularly at corporate events and conferences such as Enterprise UX, Euro IA, Interaction, SXSW, UX Australia, UX Hong Kong, and UX Lisbon.

For this hour-long chat, Steve fielded questions covering absolutely anything on the subject of user research, from introducing non-UX people to user research, to dealing with confidential interviewee information, to reducing bias and handling the practical formalities of carrying out user research.

Here are the highlights from Steve’s AMA…

Any tips for interviewing users who work with highly confidential information? Any strategies for incentivising such an audience to be open about which problems they struggle with and how they use our software, when almost everything they do is supposed to be a secret? [Timi Olotu]

I think it’s always good to set expectations clearly when arranging interviews; perhaps in that situation there might be concern – before they even agree – about risk. I worked with a bank, and they had a standard piece of text they used when recruiting bank customers, along the lines of “We won’t ever ask for your bank balance or any information about accounts.” Perhaps that was required for regulatory compliance anyway, but it served to be very clear about what we were NOT going to be asking about.

I think managing expectations is more important than incentive; but gaining access is also about understanding their dynamic or their relationship to you. Like, if they use your software, then they have a chance to give input and feedback.

I’d also add: I wonder what kind of anonymized experiences you can create – and I don’t mean that to sound fancy – but how much of the interview could you do on paper, with wireframes?

I don’t mean give them a usability test, but can you give them a set of high-level scenarios and have them pick some, choosing which to walk through using a sort of simulated or high-level version of your experience?

You wouldn’t want to do that for the whole thing, but it could be a component of an interview.

When generating questions for interviews, how do you reduce bias from pre-existing hypotheses? [Amy]

I think bias is obviously a concern but it comes up so often as the main thing people are concerned about. This is totally biased work! We are humans who are the product of our experiences and meeting other people and exchanging the slipperiest of substances: words! How can we not have bias?

I think that having hypotheses is a great thing when going into research. I mean, you put it exactly right – hypothesis. That’s not a closely-held belief, or an aspiration, it’s an idea of what you think might be happening. That sounds like what we’re supposed to be doing.

If there is a belief or a hope or an expectation – be it implicit or explicit – that seems like something we want to get to in research.

Of course there’s a ludicrous way to do that. “Don’t you agree that it’s better now that we have this feature located in this part of the UI?”

That’s a biased form of inquiry, that’s almost abusive of your power. But trying to understand someone’s framework, expectations, preferences, experiences, mental model, etc. from an open and curious point of view, and having that curiosity informed by what you have been already considering about the problem space, sounds like good research to me.

Your writings and talks have been a big influence on many folks working in the area of UX – but who are your influences and inspirations, and why? [Rob Whiting]

Our field is packed solid with great people. I keep thinking about some of the overachievers I get to meet and how accomplished they are.

I love Jess McMullin; he was one of the first people to start doing civic design, like YEARS ago, before it became such a big thing in so many parts of the world.

I am a big fan of Allan Chochinov (he wrote the foreword to Doorbells, Danger and Dead Batteries). He started a graduate program in design at SVA in NYC, called Products of Design that kind of takes a big picture look at what design can do. Mentor, friend, inspires me.

I think Kevin Hoffman is really inspiring. Funny, passionate about pop culture and knows everything, cares about people. His book about meetings is going to come out soon (I can say I saw the cover and it looks really cool).

It’s horrible to try and pick people as I’m using recency to come up with a list.

I’m trying to get user testing ingrained into my company’s process but it’s a struggle. It always seems to be the first element dropped when budgets are tight. Any advice on keeping it a part of the process? [Mike Mellor]

I think you’ve got the #1 FAQ about research… that it isn’t supported. I don’t think there’s a stock answer though. But you might wonder – or seek to understand – why is it being dropped? And why was it even being proposed or considered in the first place?

I mean, it’s one thing to say there’s no budget or time… but if someone chose to put it in to begin with, there was a narrative about its value.

If one could understand that better, one could propose an alternative or advocate, with a bit more information. I’m sure there’s some Rhetorical Studies model here I’m not expressing well, but understanding your adversary’s objections seems like one possible persuasive technique.

I’ve long said don’t advocate for the process “we have to do research” but for the outcomes “we have to make sure we understand this issue or this consequence will happen.”

[Figures 9.4 and 9.5 from Interviewing Users]

If you want to get into a discussion of timing, you can use the above diagrams as inspiration – let’s say the top one is sort of my gold standard, here’s what it takes to do it “all” – but if you want to do it more quickly, here’s how it’s going to look. It may be more or less appropriate, but that way you can have a discussion about trade-offs.

If we only have a day to find participants, for example, then we can’t be too picky, we can’t go beyond who we know right now at this moment. Maybe that’s sufficient.

So even though your question was about doing it – or not doing it – I think looking at ranges of commitments – where zero is in that range – and encouraging reflection on trade-offs – could be good. It’s not about what YOU need, it’s about what the work requires. So don’t take it on yourself. “I’m not allowed” “they won’t let me” – it’s about us, about our shared goal and your expert advice about how to reach that goal.

I’m wondering what strategies and methods you use to analyse data from your interviews. Would you recap after each interview and write down your observations when they are still fresh and then wait for all of them to be over to listen to transcripts? [Edyta Niemyjska]

You describe my preference pretty well. I separate the “processing the experience” and “processing the data.”

After an interview, I might do a debrief worksheet – but typically not. But at the end of each day, I write up a VERY quick paragraph or two about the interviews we’ve done.

It’s meant to be a storytelling exercise, it’s a forced analysis (take a large thing and pull out some smaller bits) – and it shares the fieldwork with the rest of the team. Here is a PERSON, they have a NAME, they own a THING, they told us an EXPERIENCE. So it helps me make a first pass at distilling and it gets people to think about these real actual people really quickly.

It’s not field notes.

I try to do it in just a few minutes, and do it stream of consciousness.

When we’re done with fieldwork, I like to sort of collate – very quickly pull together a topline: here’s what we think we’re hearing; what did you all think, what did you all hear? We started with questions and we have some thoughts, we have some weak signals, we have some things we’re excited about. Nothing about what to DO with this info, just where we’re at, at the moment.

And then, finally, let’s dig into the transcripts and see what actually happened.

What is the most effective way to ask simple questions to better understand where our users are coming from? Often, the users are ill at ease and want to “help” or are simply biased so they muddle the actual answer. [Karunakar Rayker]

Part of your question is about building rapport. People are often ill at ease at the beginning of a session. They want to do a good job and they don’t really understand what is going on, I mean not to be patronizing, they understand, but they don’t “get” what this exchange is meant to cover.

It’s one reason why super short interviews are challenging because it’s hard to get to a point in the relationship where you have established a smooth dialogue, where the person is not only comfortable but excited, reflective. That takes time, sometimes a huge amount of time – and people are unique, and the way we find a connection is unique to the combination of them and us.

The contact we have with people ahead of time – before the interview – can build rapport. Maybe we have a quick phone call and let them ask any questions. Maybe we give them an exercise so we can see something about them. Exercises also prime people – they engage them in thinking about the topic, so when we meet they aren’t coming to it raw and fresh and new; it gets them involved.

Sometimes we make sure in our recruiting process that we screen out people who aren’t already meeting a certain comfort level – “the articulation screen” – if the person can’t answer a question from the recruiter (tell us about a recent experience you had going to the movies) for a few sentences, then they may not be the best participant for the study.

But assuming they are “articulate” – they may not be comfortable. So our job is to keep listening, to keep affirming. I do NOT mean “okay great! Cool! Wow!” etc. I mean listening, I mean, asking follow up questions, expressing interest, validating that their point of view is important because you give it time – that’s a harder way to validate the person because you want to do MORE, but when you do that enthusiastic thing you are actually pushing them to perform for you.

Finally, when you have an uncomfortable person YOU feel uncomfortable; you are sensitive to the cues that this person is feeling weird, and you think “I should probably do something different or leave.” What if you could ignore those cues – which are about YOUR feelings – and just keep listening and focusing on them?

Can you share examples of what type of exercises you have the user complete when connecting with them prior to an interview? [Anne Jackson]

Come up with something that seems relevant – there are so many different ways to go about it. One approach is to ask them to take a picture of two different things, and send the pictures along with an explanation. Two, because it’s about examples of contrast.

Send us a picture of something in your neighbourhood that you think adds value to the experience people who live there have. And explain why. And send us a picture of something that detracts from the experience.

Send us something you’ve organized well. Send us something you wish was more organized.

These are kind of digging into the theme you suspect the session will get into.

The interview kicks off by getting them to tell you that story again!

It could be filling out a form and giving a couple of examples, but the photo stuff can be fun. Even a screenshot.

I am currently in the process of introducing a lot of PMs and Engineers to customer interviews. We are also training a few younger designers to talk to their target users.

What are some basic strategies and tips to keep in mind when introducing non UX researchers to UX research? [Nachi Ramanujam]

I write – well, scribble – on the paper. I might draw a big circle around the quote that is interesting and then write my own thoughts, “Why does she do this” or “they don’t have alignment between their goals and their choices” etc.

This formative UX study is a bit more tactical but might be really helpful – it’s so well explained (not specifically about transcripts, but at least about analysis in general – I think less about synthesis, where we take small bits and put them into new ideas and frameworks, which is what I think we do with the big mess of annotations I’m producing).

My consulting guide to fieldwork is a one- or two-pager that is meant to help people do well when they are joining in (NOT LEADING) user research interviews. It is the most boiled-down set of points I have. I think it’s like anything: the more you put in, the more you get out.

Here’s a 40 minute presentation that is about doing research. Do they have 40 minutes? https://www.portigal.com/speaking/ has a bunch of links to past talks so you can see where there are videos and slide decks.


Interviewing Users: Uncovering Compelling Insights by Steve Portigal

I also do a workshop where I ask people to interview each other and then reflect on what worked and what did not work. Practice – in a safe place – not on a work problem but on a practice problem.

For more insight, we interviewed Steve Portigal earlier this year about his user research war stories.

Join our UX Community on Slack

Ask and answer questions, take part in exclusive AMAs, share your own experiences... Discuss all things UX with the best and brightest in the industry!

(It’s like a constantly evolving networking event, and even though you have to provide your own refreshments, nobody will know that you’re wearing pyjamas.) ☕️ 💻 🛏

Christopher Ratcliff
Christopher is the Content Marketing Manager of WhatUsersDo. He's also the editor of wayward pop culture site Methods Unsound. He used to be the deputy editor of Econsultancy and editor of Search Engine Watch.
