How do you extract insights from usability test interviews?

I'm building the MVP for a B2B SaaS product and have run only 6–8 interviews. My manager suggested recording them so I could learn from them. That does yield valuable insights, but it requires a lot of work, and I'm unable to use a transcription tool since the participants spoke in their native language.

Curious to know how others do it?

9 Likes

Here is my (very broad) approach, which assumes that your usability evaluations involve asking users to complete specified tasks rather than running a “free-form” study.

I make a matrix for each participant, with the tasks as rows and the following fields as columns:

  • Could they accomplish the task? (Objective: yes or no)

  • How confidently did they carry out the task? (Subjective, based on your judgment, but you could balance it with a more objective measure such as how long it took)

  • Soundbites: direct, word-for-word quotes from the participant (objective)

  • Observations: Your unbiased notes on what you observed (they don’t have to be comprehensive; just include anything that stands out as significant).

  • Hypotheses: your own subjective notes about what happened and why the person encountered difficulty (I push myself to generate more than one hypothesis rather than narrowing down too early)

  • Ideas: any suggestions for specific adjustments that could resolve the issue if the person reattempted the task (tied back to the hypotheses)

  • If nothing leaps out for some people, there might be blank cells, and that’s okay.

Then combine this data for all participants and concentrate on areas with a high degree of similarity.
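The per-participant matrix and the aggregation step above can be sketched as a small data structure, for example in Python. The field names, the 1–5 confidence scale, and the sample tasks are my own illustration, not part of the original method:

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class TaskRow:
    """One row of the per-participant matrix: a single task."""
    task: str
    completed: bool          # objective: yes or no
    confidence: int          # subjective 1-5 rating (assumed scale)
    time_taken_s: float      # objective counterweight to the subjective rating
    soundbites: list = field(default_factory=list)    # verbatim quotes
    observations: list = field(default_factory=list)  # factual notes
    hypotheses: list = field(default_factory=list)    # why they struggled (more than one!)
    ideas: list = field(default_factory=list)         # changes tied back to hypotheses

def aggregate(matrices: dict) -> Counter:
    """Combine all participants' matrices and count how often each task
    failed, so you can focus on areas with a high degree of similarity."""
    failures = Counter()
    for participant, rows in matrices.items():
        for row in rows:
            if not row.completed:
                failures[row.task] += 1
    return failures

# Example: two participants, one shared failure
matrices = {
    "P1": [TaskRow("create invoice", False, 2, 210.0)],
    "P2": [TaskRow("create invoice", False, 1, 300.0),
           TaskRow("export report", True, 4, 60.0)],
}
print(aggregate(matrices).most_common(1))  # [('create invoice', 2)]
```

Keeping failure counts separate from the subjective notes mirrors the post's point about separating objective data, subjective impressions, and ideas for changes.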

People who take a more general approach to insights often introduce a lot of their own bias into their observations. I prefer to take a very structured approach and purposefully separate out objective, subjective, and ideas-for-changes.

9 Likes

@DhirajMehta, Thank you so much. I love how detailed this structure is; I'll try to implement it.

8 Likes

Use the behavioural analysis matrix provided by Monzo.

It focuses on how users behave and whether they can complete important user flows.

8 Likes

With only six or eight interviews, you can afford to watch each one ten times.

  1. Don’t just listen to what they say about the product; observe their facial expressions and body language. These details are lost in transcription.

  2. Users frequently lie. You need to probe the reasons behind particular comments in the moment to understand why. It is very hard to go back and do this later.

  3. The ideal strategy is to read The Mom Test, then go back and conduct further interviews. I believe a good sample size is between 20 and 30.

7 Likes

As a UX researcher, I use Dovetail to analyse interviews; it greatly reduces the time required. You can highlight the transcript after the tool has finished transcribing the video recordings. To help with the native-language issue, you can also add your own terms, which Dovetail will remember while transcribing interviews. After that, you can annotate and highlight your notes and transcripts, compare them with notes from other interviews, and draw conclusions. For reference, the insights will include the dialogue, notes, and video clips.

I think a personal account is free.

6 Likes

Thank you so much @CathrynCui, will check it out.

@KaranTrivedi, I think I should adopt your method and see how it works for me.

@AlbertChappel, You’re a godsend, many thanks!

5 Likes

Hello @Natalie,

There are both short-term and long-term strategies for interviews. You're right, though: conducting, transcribing, and synthesizing interviews takes time. Sadly, that's just the nature of the beast.

You are building a body of evidence for patterns and analysis, just like any other data point. So be aware that your B2B product workflow will involve more than one round of 6–8 interviews. More likely, you'll run three rounds of roughly eight interviews at the start, middle, and end of a large release.

Create an input matrix for the critical intersections in your product workflow: the rows are your users and the columns are observation instances. Keep the table inputs as consistent as possible across rounds to establish a benchmark for your progress over time.
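As a rough sketch of that input matrix, here is one possible Python encoding. The round names and the pass-rate values are invented for illustration; the only idea taken from the post is rows = users, columns = observation instances, with consistent inputs so progress can be benchmarked over time:

```python
# matrix[user][round] = fraction of critical-workflow tasks completed
# (hypothetical users and numbers, for illustration only)
users = ["U1", "U2", "U3"]
rounds = ["start", "middle", "end"]

matrix = {
    "U1": {"start": 0.50, "middle": 0.75, "end": 1.00},
    "U2": {"start": 0.25, "middle": 0.50, "end": 0.75},
    "U3": {"start": 0.50, "middle": 0.50, "end": 1.00},
}

def benchmark(matrix, rounds):
    """Average completion per round, showing progress over time."""
    return {r: round(sum(row[r] for row in matrix.values()) / len(matrix), 2)
            for r in rounds}

print(benchmark(matrix, rounds))
# {'start': 0.42, 'middle': 0.58, 'end': 0.92}
```

Because the columns are identical in each round, the per-round averages are directly comparable, which is what makes the benchmark meaningful.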

All the best! :+1:

5 Likes

Are they on Zoom? Use Otter.ai or something similar.

If they’re in person, request an admin sit in with you to transcribe the session.

There’s always a way…

4 Likes

I take notes even while conducting the sessions: just small things plus the timestamp, which makes it easier to go back to the important parts and quickly write a summary.

Also, after each session, take a few minutes to write everything down that stood out.
Or
Record the sessions, watch at 2x speed, and then take notes. Where there’s a will, there’s a way…

3 Likes

I also experience the same issue.

I’ll have to review recordings or meeting transcripts if I don’t take notes.

Transcription services like Otter.ai are terrible and still make me rewatch the videos.

I’ve seen people take notes both by voice and on paper; I need to practice both to become proficient in one.

2 Likes

So, make it collaborative. Encourage a designer or engineer on your team to accompany you to the interview and take notes. It helps with gathering insights and builds customer empathy on the team. Win, win.

1 Like

Aaaaah! Getting the designer or engineer involved! I wonder how you even convinced them.

1 Like

Set expectations. The usual procedure is to have someone take notes; it doesn't really matter who (see Nielsen Norman Group's guidance). If they don't provide someone, you'll have to analyse every recording, the analysis will take twice as long, and you might still overlook something.

Give customers the classic trade-off: good, fast, or cheap; pick two.