When doing user interviews, do you experience false positive results?

I frequently observe a significant difference between what people say and how they actually behave. I’m referring mainly to user research and feedback: for instance, a user describes how they use a certain feature, and we later discover from application data that they never actually used it. User interview outputs and “hard” analytics data can seem like products of two different universes.
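One concrete way to surface this gap is to cross-check who *claimed* in interviews that they use a feature against who actually fired the corresponding event in an analytics export. A minimal sketch in Python; the user IDs, event names, and log format are all invented for illustration, not a real analytics API:

```python
# Hypothetical sketch: reconciling interview claims with event logs.
# All IDs, event names, and the log structure are made up.

claimed_users = {"u1", "u2", "u3"}  # said in interviews they use the feature

# Flattened event log, e.g. exported from an analytics tool
events = [
    {"user": "u1", "event": "feature_x_used"},
    {"user": "u1", "event": "login"},
    {"user": "u4", "event": "feature_x_used"},
]

# Users who actually triggered the feature at least once
actual_users = {e["user"] for e in events if e["event"] == "feature_x_used"}

# Said they use it, but never did: the interview "false positives"
false_positives = claimed_users - actual_users
print(sorted(false_positives))  # → ['u2', 'u3']
```

The point is not the code but the habit: pull the behavioral data before (or after) the interview and let the mismatch drive your follow-up questions.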

In my experience, people frequently want to look “better” and to please you, so they tend to come across as more enthusiastic about the product than they actually are.

Realizing this… Is it wise to speak with your users? And how can the possibility of false positives from interviews be reduced? I’m really having trouble with this.


This will always happen. Ask them about requirements and issues rather than features.

If you want to assess a product’s usability, give them specific scenarios to complete in real tests, or at the very least on prototypes.


Piggybacking on this @NatalieSmith, because I concur.

It’s also about how you ask the questions, not just what you’re asking them.


Correct. Additionally, just because you’re speaking with someone doesn’t necessarily mean your research is sound.

Even if your research goal is just to “put yourself in their shoes,” you still need to define it. What do you want to know about their background and how they came across your product? When did they experience your product’s “aha!” moment? Etc.

When asked how frequently they visit the gym, most people will round up. Unless you’re trying to validate quantitative data, ask about specific past experiences instead: when was the last time they skipped the gym? Why?

Determine the assumptions you’re testing: is it the feature’s desirability? Its usefulness? Its feasibility?

The more specific you are, the quicker you can get answers to the questions that matter. I frequently see PMs asking whether something sounds beneficial without providing any context. Of course an organisation tool could be helpful, but what problem is it addressing? If it’s solving a real problem, what are they doing today instead? And if they’re doing nothing, why aren’t they doing anything to remedy it right now?

User research is incredibly useful, but when conducting it you must have a clear goal in mind. Otherwise, all you’re doing is going through the motions. What does success look like for this conversation? What assumption needs to be disproven before you can proceed?


Ever heard “talk is cheap”? Talk alone is meaningless: show me the data. It’s best to combine both, but don’t be surprised if the talk doesn’t match the reality…


I’ve literally thrown away millions of dollars on features that customers approved and even assigned monetary values to, saying things like, “If you build that, we will spend x,” and who then rationalized why they wouldn’t use it. The best advice I can offer is to play devil’s advocate and consider what organizational and infrastructure changes would be required for your customer to actually make the change and adopt the new functionality. Take those, assume the worst, and weigh them against the benefit they stand to receive from your feature. Unless the benefit-to-cost ratio is high, it’s probably not worthwhile. It’s not ground-breaking advice, but I’ve previously undervalued the importance of understanding the adoption process.


The best way to get around this is to stop asking whether they like or use your product. Instead, use interview questions like “Tell me about a time when you did XYZ,” where XYZ is the situation or problem your product solves. Pay close attention to how they resolve it and ask lots of questions, so you understand why they have the issue and what they currently do to address it. Because changing behaviors is so difficult, it is frequently more useful to examine the situation as a whole.


So the general heuristic is: don’t ask them, observe them. Is there a way to watch how they actually behave? Give them a task to perform, such as “start an asynchronous chat with the meeting participants,” and watch them either complete it or fail. Whether an async conversation is something they’d even want to do should be established with other techniques.
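If you do run observed tasks like this, the output you care about is simple pass/fail tallies rather than opinions. A tiny illustrative sketch; the participant IDs and results are invented:

```python
# Hypothetical sketch: tallying observed task outcomes instead of asking opinions.
# Task: "start an asynchronous chat with the meeting participants".

results = {
    "p1": True,   # completed the task unaided
    "p2": False,  # gave up or needed help
    "p3": True,
    "p4": False,
    "p5": False,
}

# Fraction of participants who completed the task
success_rate = sum(results.values()) / len(results)
print(f"{success_rate:.0%}")  # → 40%
```

Tracking the same task across prototype iterations gives you a behavioral trend line that interview enthusiasm can’t fake.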


@MariaWilson, it’s also crucial to emphasize to participants, before beginning these tasks or exercises, that there is no right or wrong answer: if they can’t complete a task, that reflects on the software, not on them. In my opinion, this reminder at the start helps participants frame the discussion.


I was just going to mention this @AmyWalkler! Tell them that you don’t care about right or wrong. Also give them a task or product that their user type should be familiar with. I’ve found that typical users (not new users) produce the best results. No question, if your typical repeat customers run into problems then new users will too, but something has kept the repeat customers coming back.

Now that you’re interviewing them, they will be more willing to keep using the product if they feel appreciated and listened to. Just be sure to follow up on any improvements made as a result of their help. My internal stakeholders are more willing to use my product despite its drawbacks because they enjoy seeing their feedback take effect.


User research specialists put a lot of thought into verbal patterns that avoid this. People will tell you what you lead them to say. So you have to be careful about what you ask and how you ask it.


This is a well-known phenomenon in user research, and in focus groups and psychology studies in general. It is called “acquiescence bias”, and it’s a natural outgrowth of the desire to avoid confrontation.

Your best defense against this bias is to focus on observing behaviors and prompting participants to “show me how” rather than “tell me if.”


Interviews with existing users are fairly low value. The interviews you want are with your competitors’ customers, and with potential customers who haven’t yet bought any solution.

Read “The Mom Test” for helpful ways to avoid biased responses.


@AngelaBlue, thanks for this insight. The worst is talking with people who have some relationship with a team member.

I experience those problems with non-users and strangers too. Everyone wants to be seen a certain way.

Definitely will take a look at “The Mom Test” :+1:


It’s a quick, enjoyable read, I agree. It significantly changed the way I think about approaching discovery.

@AlbertChappell, I’ve also encountered people whose responses don’t line up with their behavior. I think it helps to revisit the conversation with that data in hand and ask something like, “Can you walk me through what happened in this scenario?” Also, make sure you are truly hearing what they are saying, as opposed to what you want to hear (and I’m not implying you are doing this).


This happens and will continue to happen. One technique I’ve found helpful is asking open-ended questions about the topic or flow you’re interested in. For instance, if you want to find out whether they have used the “add guests” option in your product, ask them open-endedly how they use Aimful. Ask them to demonstrate the steps they take in your product (maybe ask them to share their screen). You might also try pulling their usage statistics, or give them a scenario to work through.
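For the usage-statistics route, even a crude per-user count from an exported event log tells you what to probe in the interview. A hedged sketch; the event names, user IDs, and export format are assumptions, not a real product’s API:

```python
from collections import Counter

# Hypothetical sketch: counting "add guests" usage per user from exported events,
# so the interview can focus on what actually happened (or didn't).

events = [
    {"user": "u1", "event": "add_guests"},
    {"user": "u1", "event": "add_guests"},
    {"user": "u2", "event": "create_meeting"},
]

add_guest_counts = Counter(e["user"] for e in events if e["event"] == "add_guests")

print(add_guest_counts["u1"])        # → 2
print(add_guest_counts.get("u2", 0)) # u2 never used "add guests" → 0
```

A user showing 0 here who enthusiastically describes the feature in an interview is exactly the false positive this thread is about.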

Closed-ended questions are typically best reserved for factual matters like “do you attend school?”


@MarcoSilva, I can see the logic behind this strategy for existing features, but what about potential new features? Is it even worth asking about them, or is it enough to just test the prototype, MVP, etc.?


@AlbertChappell, asking them about their workflow (the activities they complete and how they do them) will still be effective. If you already have a rough idea of the problem and a workable solution, testing prototypes would also be wise. But it’s important to remember that user testing and interviews are two distinct methods that answer different kinds of questions.
