Do you do continuous product discovery in your team?

I’ve learned from Teresa Torres’s webinars that this is the future of product research: frequently speaking to customers and running rapid experiments to find the right product opportunities to pursue (essentially, knowing what to build).

It appears Teresa and I are on the same page. Even before I found out about this concept, I knew current methods for finding product opportunities were too time-consuming, expensive and complex to perform frequently, so I began interviewing entrepreneurs, PMs and startup coaches about their experiences. This led to developing a SaaS solution to help PMs (like myself) rapidly run experiments with existing and potential customers to discover which product opportunities to pursue next.

If you do continuous discovery, what software tools do you use to conduct research or run tests?

4 Likes

To be totally honest, I’ve never done anything OTHER than what is labelled Continuous Discovery. I’d always thought this was just standard practice.

As a general rule, I’ve always tried to dedicate at least 25% of my time in any given week to discovery processes, even if they’re completely unstructured… Just jumping on a call with a user for 30 minutes and talking about how they’re finding the product and spit-balling ideas for things they’d like to see changed.

4 Likes

@KaranTrivedi, how do you inform your product teammates (or the others in the product trio) about your findings, so they’re on the same page with you, especially when priorities are being defined?

3 Likes

@FelipeRibeiro, I think it’s important to separate the kind of conversations you’re having in “continuous discovery” mode vs “intense discovery” mode.

It’s not a “one or the other” scenario; you’re doing both. For me, continuous discovery is about “scouting”: keeping a finger on the pulse of customers, throwing 20 ideas at the wall every week and seeing what sticks.

90% of the conversations end up with very little value being generated, but the 10% that are valuable should ideally then lead to more detailed “intense discovery” where you dig in deep on those particular issues and involve the full trio.

Think of continuous discovery like gold prospecting. You want one person out there checking 1000 different patches for small signs of gold. Once your prospector finds something… That’s when you bring in all the gear and build the mine, no earlier.

4 Likes

It’s less about the tools and more about how you organize things. There are a ton of ways to do discovery. One myth that seems to be prevalent is that discovery needs to take months and months. It doesn’t. Running quick discovery will already allow you to understand small things, and knowing something is better than knowing nothing. Also, as long as you’re always gathering feedback, discovery is continuous anyway. You’re opening yourself up to learning.

The important thing here is what you do with all that feedback and what you learn from it. For us, it goes into our feedback backlog, where it is linked to problems to solve in our product backlog. This means we have a direct feed, and the more feedback we get, the better the product team can understand the problem, prioritize it, and propose different solutions. (For this we use AirFocus.)
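To make that linking concrete, here’s a rough Python sketch of the shape of the relationship (purely illustrative; this is not how AirFocus models it, and all the names are made up):

```python
from dataclasses import dataclass, field

# Sketch of the idea: raw feedback is never actioned directly; each item is
# linked to a problem in the product backlog, so the problem accumulates
# evidence as more feedback arrives.

@dataclass
class Feedback:
    source: str    # e.g. "support ticket", "interview", "in-app form"
    customer: str
    note: str

@dataclass
class Problem:
    title: str
    feedback: list[Feedback] = field(default_factory=list)

    def link(self, item: Feedback) -> None:
        self.feedback.append(item)

    @property
    def evidence_count(self) -> int:
        # one of several inputs to prioritization, not the whole story
        return len(self.feedback)

slack_problem = Problem("Users lose track of requests raised in Slack")
slack_problem.link(Feedback("interview", "Acme Corp",
                            "We copy-paste Slack threads into tickets by hand"))
print(slack_problem.evidence_count)  # 1
```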

3 Likes

Thanks, you’re definitely right about that myth and running quick discovery (Teresa Torres talks about this at length).

Just to clarify: you said you collect feedback and link it to problems to solve, then prioritize (I assume based on the number of feedback items linked to the problem; if I’m wrong, I apologize). Don’t you validate the feedback in order to prioritize it? When I say validate: imagine 100 customers wanted a Slack integration. You’d want to know what pain points they had (relative to their request for a Slack integration) so you could correctly define the problem for each customer and how intense it is. If the majority don’t have an intense problem that they experience frequently enough, but other feedback comes in (from 10 users) where the problem is intense, how would you prioritize without validating it?

3 Likes

@Felipe, I don’t seek to validate feedback; I seek to understand it. What problem is the user describing? Do I understand it fully? Are they proposing a solution (which tends to happen), or have I asked enough ‘whys’ to understand what the problem is? Validating feedback means you’re siloing the solution into a very specific funnel, whereas asking “what problem is this user experiencing?” opens you up to various possible solutions. (Try to stay away from bias.) The amount of feedback is important, but it’s not the only factor that goes into deciding whether something gets built.

First come the objectives: does the problem align with what we are trying to achieve? Second is the problem itself: do we understand it? Third is the ‘why’: why should we do this, and what value might it provide to the customer?

Then we go through discovery and understand all the possible problems. As you say, if people are asking for a Slack integration, what particular problem or pain point do they have that should shape how we approach that integration? It could be that it’s really just about creating items from Slack, and syncing threaded conversations is a ‘nice to have’, which very quickly helps determine what v1 might look like.

3 Likes

@NathanEndicott, Thanks for correcting me on using ‘validating feedback’.

3 Likes

@Felipe, it’s very easy to fall into the trap of ‘validating.’ Don’t validate - seek to understand. Understand the problem, understand the feedback. Validation often leads to trying to validate what you already know, instead of trying to figure out what you don’t know.

3 Likes

How do you keep your product teammates in the loop about your findings (as Teresa Torres says, the tech lead, design lead and PM should be on the same page when doing continuous product discovery), so they too understand the needs you’ve uncovered?

3 Likes

Meet with them. Set aside the time to walk through decisions and what comes next. I know everyone dreads meetings, but how else will you let them know? Set talking points, outcomes, and next steps to keep the meeting focused. Also - document EVERYTHING. We use AirFocus to write down the product problem outlines, findings, research, everything. Nobody benefits if things aren’t written down.

3 Likes

@NathanEndicott, It appears AirFocus is somewhat of a competitor of mine, although my tool is mainly focused on conducting research interviews, observations and surveys, as well as rapidly testing solutions (in a way that is convenient for potential and existing customers).

3 Likes

Maybe a complement. airfocus is more of a product/roadmap/prioritization tool - yours is specifically for interviews. Interviews wouldn’t be done in airfocus; that’s what a research tool like yours would do. The data and information would be linked in airfocus, so the product team has all the data at hand to make those decisions (i.e., alongside other feedback, objectives, roadmaps, and additional data points). Feedback shouldn’t be the only source for a decision; it’s just one of them!

3 Likes

What techniques do you use to recruit customers for interviews? (I’m sure a lot of PMs beginning continuous discovery would benefit from your answer.)

3 Likes

@Felipe, It depends. Don’t tie yourself to one method; try different things based on what level of discovery you’re doing. You can use surveys, quick in-app forms, calls, prototypes and usability testing. It really varies based on how deep into discovery you are and what kind of information you’re looking to uncover. In the words of Teresa Torres, there is no one way of doing discovery, as long as you’re doing it.

3 Likes

We use AB Tasty for testing small UX changes. We use Appcues as well. And we built the ability to run larger feature A/B tests into our platform. Plus we do lots of talking to users: we schedule meetings with some new users, have a set of VIPs we talk to frequently, and can source specific cohorts of users when we need input from particular types of users.
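The built-in feature tests aren’t anything fancy; at their core it’s deterministic bucketing so a user always sees the same variant. A rough sketch of the general idea (illustrative only, not our actual implementation; names are made up):

```python
import hashlib

# Hash the user id together with the experiment name so each user always
# lands in the same variant, and different experiments split users
# independently of each other.
def variant(user_id: str, experiment: str,
            variants=("control", "treatment")) -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

user_variant = variant("user-123", "new-onboarding-flow")
print(user_variant)  # "control" or "treatment", stable for this user
```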

2 Likes

@SamanthaYuan, What tools do you use for extracting insights from interviews? Do you do thematic analysis?

1 Like

@FelipeRibeiro, No tools. We keep a database of interviews in Notion and tag each one with the persona type, demographics, area of inquiry, and topics discussed.
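Roughly, each entry carries fields along these lines (a simplified sketch of one record; in reality these are just Notion database properties, and the values here are made up):

```python
# One interview record, shown as plain data for illustration only.
interview = {
    "date": "2023-05-04",
    "persona": "ops manager",
    "demographics": {"company_size": "50-200", "region": "EU"},
    "area_of_inquiry": "onboarding",
    "topics": ["setup friction", "permissions", "slack integration"],
}
```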

2 Likes

@Samantha, I’ve heard of many others doing the same, or using Excel. So I assume you manually quantify themes across the interviews?

Do you do session recording with tools like Hotjar to identify which parts of your product customers have problems with? Some customers don’t provide feedback or request support; they just find a workaround (I’m guilty of this) because they imagine the support process would be time-consuming.

1 Like

Absolutely. If you love this line of work it’s just in your blood to always be talking and learning about your product.

I work in hospitality and love going out into the field, talking with my peers and customers on the front line. A free cup of coffee and a “how are things going?” can do wonders for finding unsolved, impactful hidden problems. I also find app reviews and in-the-moment surveys to be hugely helpful, at least for digital products.

There are lots of good tools to distill all that quant data and some of the qual. I just find the feedback to be so much more meaningful when there is a face to it. It makes the storytelling with your teams much more impactful. If you are building a tool like that I’d love to hear more!

1 Like