“What features do you want us to add?”
I cringe every time I hear a client ask users this question.
Not because user feedback isn’t valuable—it absolutely is. But because asking users what they want is the least effective way to understand what they need.
In 18 years of UX research, I’ve learned one uncomfortable truth: Users are terrible at telling you what they want. They’ll request features they’ll never use. They’ll say they want customization they’ll never configure. They’ll ask for complexity that will make them abandon your product.
Here’s why—and more importantly, how to actually uncover what users need.
Why Users Can’t Tell You What They Want
They Don’t Know What’s Possible
User: “I wish there was a way to track all my shipments in one place without checking multiple websites.”
What they don’t know: This already exists. Browser extensions, aggregator apps, email parsing services—multiple solutions are available.
The problem: They’re expressing frustration with their current workflow, but they can’t envision solutions outside their experience.
If you build what they described, you may simply recreate something that already exists (and does the job better).
What you should ask instead: “Walk me through the last time you had to track multiple shipments. What did you do? Where did you get stuck?”
This reveals their actual process, pain points, and workarounds—which gives you insight into what solution would actually fit their workflow.
They Rationalize Past Behavior
User: “I’d definitely use a budgeting feature to track my spending.”
Reality: They’ve never stuck with a budgeting tool for more than two weeks.
Why they say this: Because it sounds responsible. Because they wish they were the kind of person who budgets. Because saying “I don’t care enough to track my spending” feels bad.
This is called stated preference vs. revealed preference.
- Stated preference: What people say they want
- Revealed preference: What people actually do
Revealed preference wins every time.
Real example from a client:
We surveyed users of a meal-planning app:
- 87% said they wanted advanced meal prep features
- 64% said they’d pay extra for custom nutrition planning
- 73% said they’d use recipe importing from blogs
We built all three features.
Usage after 6 months:
- Advanced meal prep: 4% of users
- Custom nutrition planning: 2% of users (most didn’t complete setup)
- Recipe importing: 11% of users (tried once, never used again)
What we should have done: Observed what users actually did with the existing features, not what they claimed they wanted.
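When you do have product analytics, the revealed-preference check can be a few lines of analysis. Here’s a minimal sketch in Python, assuming a hypothetical event log and illustrative feature names (none of this reflects the client’s actual stack):

```python
# Hypothetical event log: one row per feature-usage event. In practice
# this comes from your analytics pipeline; the feature names here are
# illustrative, not a real event schema.
events = [
    {"user": "u1", "feature": "meal_prep"},
    {"user": "u2", "feature": "recipe_import"},
    {"user": "u2", "feature": "recipe_import"},
    {"user": "u3", "feature": "nutrition_planning"},
    # ...thousands more rows in a real log
]

TOTAL_USERS = 1000  # from your user table, not from the survey sample

def adoption_rate(events, feature, total_users):
    """Share of all users who used a feature at least once."""
    users = {e["user"] for e in events if e["feature"] == feature}
    return len(users) / total_users

# Stated preference (the survey) vs. revealed preference (the logs).
stated = {"meal_prep": 0.87, "nutrition_planning": 0.64, "recipe_import": 0.73}
for feature, claimed in stated.items():
    actual = adoption_rate(events, feature, TOTAL_USERS)
    print(f"{feature}: {claimed:.0%} said they'd use it; {actual:.2%} did")
```

The gap between those two columns is the finding. The survey numbers alone would have justified building all three features.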
They Propose Solutions, Not Problems
User: “You need to add a dark mode.”
This sounds like clear, actionable feedback. But it’s not.
It’s a proposed solution to an unstated problem.
The actual problems could be:
- Eye strain from using the app at night
- Difficulty focusing with bright backgrounds
- Preference for aesthetic consistency with other apps
- Battery drain on OLED screens
Each of these problems has multiple solutions. Dark mode is just one.
Maybe the real issue is:
- Text contrast is poor
- Font size is too small
- Blue light is too intense
- UI is visually overwhelming
You won’t know unless you dig into the problem behind the request.
What you should ask: “What about the current design isn’t working for you? When do you notice this most?”
They Overestimate Future Behavior
User: “If you added team collaboration features, I’d invite my whole company to use this.”
Translation: “In this moment, hypothetically, I think I might do that.”
Reality: When you build the feature, they’ll invite two colleagues. One will ignore the invite. The other will use it once.
This is called the intention-action gap.
People dramatically overestimate how much effort they’ll put into future behavior, especially:
- Inviting others
- Customizing settings
- Learning new features
- Changing established workflows
Real example:
A SaaS client wanted to build team collaboration features because “users kept requesting them.”
We interviewed 20 users who’d requested collaboration:
Question: “You mentioned you wanted team features. Tell me about the last time you tried to collaborate with your team using our product.”
Results:
- 12 users: “Oh, I haven’t actually tried. I just thought it would be useful.”
- 5 users: “I mentioned it to my team but nobody seemed interested.”
- 2 users: “We tried using a shared account but it got confusing.”
- 1 user: Actually had a clear collaboration workflow in mind
Of the 20 users who “wanted” collaboration, 17 had never even attempted it, and only 1 had a concrete workflow in mind.
We didn’t build the feature, saving 6 months of development time.
The Wrong Questions vs. The Right Questions
Here’s how to shift from asking what users want to understanding what they actually need.
Wrong: “What features do you want?”
Why it’s wrong: You get a wishlist of hypothetical features they’ll never use.
Right: “Tell me about the last time you used [product]. What were you trying to accomplish?”
This reveals:
- Actual use cases (not imagined ones)
- Real workflows (not ideal ones)
- Genuine pain points (not assumed ones)
Example:
Wrong approach: “What features would make this project management tool better?”
User response: “Gantt charts, resource allocation, budget tracking, time tracking, advanced reporting…”
Right approach: “Tell me about the last project you managed. Walk me through how you used our tool.”
User response: “Well, I created the project and added tasks. Then I had to switch to Google Sheets to track who was doing what because I couldn’t see everyone’s workload at once. Then I used Slack to follow up because notifications in the app weren’t working the way I expected. Honestly, I mostly use the app just to create tasks, but everything else happens elsewhere.”
What you learned: The problem isn’t missing features. It’s that workload visibility and notifications aren’t working effectively. Fix those, and you solve the real problem.
Wrong: “Would you use this feature?”
Why it’s wrong: People say yes to features that sound useful, even if they’d never actually use them.
Right: “Tell me about a time you needed to [solve this problem]. What did you do?”
This reveals whether the problem exists frequently enough to matter.
Example:
Wrong approach: “Would you use a feature that lets you schedule emails to send later?”
User response: “Yes, definitely! That sounds super useful.”
Right approach: “Tell me about the last time you wanted to send an email but didn’t want it to go out right away.”
User response: “Hmm… I guess when I’m working late and don’t want to email my team at 11pm? But honestly, I just save it as a draft and send it in the morning. Or I just send it—they can ignore it if they want.”
What you learned: This isn’t actually a painful problem for them. They have workarounds they’re satisfied with.
Wrong: “What’s your biggest frustration with our product?”
Why it’s wrong: Users give you generic, top-of-mind complaints that may not reflect actual usage problems.
Right: “Walk me through the last time you felt frustrated using our product. What were you trying to do?”
This reveals:
- Specific moment of friction (not general dissatisfaction)
- Context of the problem
- What they were trying to accomplish
- What they expected to happen
- What actually happened
Example:
Wrong approach: “What’s your biggest frustration with our e-commerce site?”
User response: “It’s slow. And the search doesn’t work well.”
Okay, but… how slow? Which pages? What were you searching for?
Right approach: “Tell me about the last time you got frustrated using our site. What happened?”
User response: “I was trying to find a specific product I’d purchased before. I searched for it but couldn’t remember the exact name, so I tried searching by category. But the category had like 200 products and I couldn’t filter by what I’d already bought. So I went to my order history, but that’s organized by date and I couldn’t remember when I bought it. I ended up calling customer service.”
What you learned: The problem isn’t search or speed. It’s that there’s no “buy it again” feature or purchase history search. Completely different solution.
Wrong: “How likely are you to recommend our product?”
Why it’s wrong: A Net Promoter Score doesn’t tell you why people would (or wouldn’t) recommend you, or what to fix.
Right: “Have you recommended our product to anyone? Tell me about that conversation.”
If they have recommended it:
- What did they say about it?
- What problem did their friend have that made them think of your product?
- What hesitations did they mention?
If they haven’t:
- Why not?
- What would need to change for them to feel comfortable recommending it?
- When was the last time they recommended a similar product?
This reveals your product’s actual value proposition (according to users) and what genuinely holds people back from advocacy.
Wrong: “What would make you upgrade to our premium plan?”
Why it’s wrong: Users will list expensive features they want for free, not features they’d actually pay for.
Right: “Tell me about the last time you considered upgrading. What made you think about it? What stopped you?”
This reveals:
- Real upgrade triggers (actual moments of consideration)
- Genuine barriers (price, specific missing features, unclear value)
- Whether they’ve ever seriously considered it at all
Example:
Wrong approach: “What features would convince you to upgrade?”
User response: “Unlimited storage, advanced analytics, priority support, custom branding…”
Translation: “Give me everything for free.”
Right approach: “Have you ever considered upgrading to our premium plan? Tell me about what prompted that.”
User response: “Yeah, last month I hit my storage limit and had to delete old files. I looked at the premium plan, but it was $49/month and I’d really only need the storage. Everything else I don’t care about. So I just deleted files and moved some stuff to Google Drive instead.”
What you learned: Storage limits trigger upgrade consideration, but bundling forces them to pay for features they don’t want. Solution: Offer a storage-only upgrade at a lower price point, or make storage limits more generous at the base tier.
The Best Research Questions (That Actually Work)
After 18 years of user research, these are the questions that consistently reveal actionable insights:
1. “Walk me through the last time you [did the task]”
Gets you:
- Real behavior, not hypothetical
- Specific context
- Actual pain points
- Current workarounds
- Decision-making process
Example: “Walk me through the last time you booked a hotel. Start from when you first thought about it and tell me everything you did.”
2. “What were you trying to accomplish?”
Gets you:
- User’s goal (which might not match what you think)
- Context and constraints
- What success looked like to them
Example: “You mentioned you used our app yesterday. What were you trying to accomplish?”
3. “What did you expect to happen?”
Gets you:
- User’s mental model
- Where your design violates expectations
- What would feel intuitive to them
Example: “When you clicked that button, what did you expect to happen?”
4. “What did you do instead?”
Gets you:
- Workarounds (which reveal unmet needs)
- Alternative solutions they prefer
- Whether the problem is worth solving
Example: “You said you can’t do that in our app. What do you do instead?”
5. “Show me how you currently do this”
Gets you:
- Actual workflow (not described workflow—there’s often a huge difference)
- Tools they use
- Steps they take
- Where they struggle
Example: “Can you share your screen and show me how you currently manage your client list?”
6. “Tell me about a time when that went wrong”
Gets you:
- Edge cases
- Error scenarios
- What causes frustration
- Impact of failures
Example: “Tell me about a time when our checkout process didn’t work the way you expected.”
7. “How did you learn to do that?”
Gets you:
- Whether your product is intuitive or requires learning
- What makes sense vs. what’s confusing
- Where onboarding fails
Example: “You did that really efficiently. How did you learn to do it that way?”
Real-World Example: Redesigning a Document Editor
Client: B2B document collaboration tool, 50,000 users
Their approach: “We surveyed users. They want more formatting options, better templates, and advanced collaboration features.”
My approach: Interviewed 15 users about how they actually use the product.
What Users Said They Wanted:
- More font options
- Advanced formatting controls
- Better templates
- Real-time collaboration cursors
- Version comparison features
- Custom color schemes
What I Asked:
“Walk me through the last document you created using our tool. Start from the moment you opened the app.”
What I Actually Learned:
User 1: “I opened a blank document, but then I realized I needed to create the same structure as last time, so I found my previous document, copied it, pasted it into the new one, and deleted all the content. Then I started writing.”
Pain point: Reusing document structure is tedious. Solution: Quick templates from previous docs, not more preset templates.
User 2: “I drafted the document in our tool, but then I had to send it to my boss for review. He doesn’t use our tool, so I exported to Word, emailed it to him, he sent back a version with track changes, and then I manually updated our version.”
Pain point: Collaboration with non-users is broken. Solution: Better export/import with change tracking, not more real-time features.
User 3: “I was on a call and someone asked me to pull up the Q3 report. I knew I’d created it, but I couldn’t remember what I’d named it. I searched ‘Q3’ and got 47 results. It took me 5 minutes to find it while everyone waited on the call.”
Pain point: Document organization and search are broken. Solution: Better metadata, recent docs, favorites—not more formatting options.
User 4: “I was working on a proposal and realized I needed to check what we’d written for a similar client last year. But I couldn’t search inside documents—only titles. So I had to open like 12 documents before I found the one I needed.”
Pain point: No full-text search. Solution: Search across document content, not custom color schemes.
What We Actually Built:
Based on actual usage patterns (not feature requests):
- “Start from previous document” option: lets users quickly reuse structure
- Improved search with full-text indexing: find content inside documents
- Recent documents and favorites: quick access to frequently used docs
- Better Word import/export: preserve changes when collaborating with external stakeholders
- Smart tagging and metadata: better organization for retrieval
Results After Launch:
- Time to create new document: -40%
- Time to find existing document: -65%
- External collaboration frustration: -58% (from surveys)
- Usage of requested formatting features we didn’t build: N/A (because we didn’t build them)
- User satisfaction: +34%
None of the features users “wanted” were the features that actually improved their experience.
How to Actually Conduct User Research
Here’s my process for uncovering real user needs:
1. Observe Behavior, Don’t Ask About Preferences
Instead of: “Would you use this feature?”
Do this: Watch them use your current product. Record where they struggle, what they complain about in the moment, what workarounds they create.
2. Ask About the Past, Not the Future
Instead of: “What would you do if we added X?”
Do this: “Tell me about the last time you needed to do X. What did you do?”
Past behavior predicts future behavior. Hypothetical scenarios don’t.
3. Dig Into the “Why” Behind Feature Requests
When a user says: “You should add [specific feature]”
Don’t: Write it down as a feature request
Do: Ask: “What problem are you trying to solve? Tell me about the last time you ran into this.”
4. Look for Patterns in Behavior, Not Consensus in Opinions
Instead of: “8 out of 10 users said they wanted dark mode”
Look for: “8 out of 10 users reduced screen brightness manually or mentioned eye strain at night”
One is stated preference (unreliable). The other is revealed preference (actionable).
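As a concrete sketch of that shift, here’s how you might score the dark-mode signal from behavior instead of survey answers. The session log, field names, and the 9 p.m./6 a.m. night window are all assumptions for illustration, not data from a real client:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical session log; in practice, pulled from your analytics store.
sessions = [
    {"user": "u1", "start": datetime(2024, 3, 1, 22, 40)},
    {"user": "u1", "start": datetime(2024, 3, 2, 23, 15)},
    {"user": "u2", "start": datetime(2024, 3, 1, 9, 5)},
    # ...
]

def night_share(user_sessions, night_start=21, night_end=6):
    """Fraction of a user's sessions that begin late at night."""
    night = [s for s in user_sessions
             if s["start"].hour >= night_start or s["start"].hour < night_end]
    return len(night) / len(user_sessions)

# Group sessions by user, then flag users who mostly work after dark.
by_user = defaultdict(list)
for s in sessions:
    by_user[s["user"]].append(s)

night_users = [u for u, ss in by_user.items() if night_share(ss) > 0.5]
print(f"{len(night_users)} of {len(by_user)} users are mostly night users")
```

If a large share of your users really do work at night, dark mode earns its place on the roadmap. If not, those eight-out-of-ten survey votes were noise.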
5. Test Solutions, Don’t Pitch Ideas
Instead of: “We’re thinking about adding X. Would you use it?”
Do this: Build a prototype. Put it in front of users. Watch what they actually do (without telling them what to do).
Their behavior tells you if it’s valuable. Their words often don’t.
The Bottom Line
Users can’t tell you what they want because they don’t know.
They can tell you:
- What frustrates them
- Where they get stuck
- What they’re trying to accomplish
- What they do instead when your product fails them
Your job isn’t to collect feature requests. It’s to understand user problems deeply enough to design better solutions than users could imagine.
The next time someone asks “What do users want?”, ask a better question:
“What are users trying to do, where are they struggling, and what are they doing instead?”
That’s where the real insights are.
After 18 years of UX research, I’ve learned: The best products aren’t built by asking users what they want. They’re built by understanding what users actually do—and designing solutions they didn’t know they needed.
Stop asking users what they want.
Start asking them what they do.
The difference is everything.