Why UX research is the missing piece in most CX strategies
Kevin Le
CTO · January 20, 2026
Most CX strategies are built around outcomes: improve CSAT, raise NPS, reduce churn. These metrics matter, but they don't explain why customers feel the way they do or how they actually experience the journey.
That's where UX research is consistently undervalued. Too often it's seen as a product function — usability testing, feature validation — rather than a strategic input for customer experience. When that happens, CX strategies become reactive, driven by dashboards instead of understanding.
What UX research adds to CX strategy
UX research answers the questions that metrics can't:
| Metric tells you | Research tells you |
|---|---|
| CSAT dropped 8% this quarter | Customers find the new ticket form confusing |
| NPS is flat | Customers love the product but hate the onboarding flow |
| Churn increased among enterprise accounts | Implementation timelines are consistently missed |
| Average handle time is rising | Agents can't find customer context fast enough |
Research methods like contextual inquiry, agent shadowing, and usability testing surface the specific friction points behind the numbers.
Three ways to integrate research into CX
1. Connect experiences across the journey
CX teams often optimize isolated touchpoints — a support call here, an onboarding flow there. But customers experience the whole journey. A smooth support interaction can't erase the frustration of a confusing onboarding process.
Research connects the dots by mapping how customers actually move through your product and support system. It reveals the transitions where context is lost, where customers hesitate, and where they give up.
At buttercream, we use conversation analytics across the full customer journey to surface these hidden friction points — the moments where a ticket was resolved but the customer still left frustrated.
2. Replace assumptions with evidence
CX organizations are rich in data but often poor in understanding. When dashboards don't explain a trend, teams fill the gap with guesswork.
Research methods that ground decisions in evidence:
| Method | What it reveals |
|---|---|
| Thematic analysis | Recurring pain points across hundreds of tickets |
| Behavioral research | Where and why customers drop off or escalate |
| Agent shadowing | Where tools slow agents down or create confusion |
| Usability testing | Whether a self-service flow actually resolves the issue |
| Co-creation workshops | What customers actually want, before you build it |
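To make the first of these concrete: thematic analysis can start as simply as tagging tickets against a small set of themes and counting recurrences. A minimal sketch, assuming tickets are exported as plain-text strings; the theme names and keywords below are illustrative, not a standard taxonomy — in practice they come from open coding a sample of tickets first.

```python
from collections import Counter

# Illustrative themes and keywords (hypothetical — derive yours from
# open coding a sample of real tickets, not a predefined list).
THEMES = {
    "billing confusion": ["invoice", "charged", "refund"],
    "onboarding friction": ["setup", "getting started", "activate"],
    "form usability": ["form", "field", "submit"],
}

def tag_themes(tickets):
    """Count how many tickets touch each theme (a ticket counts
    once per theme, no matter how many keywords it matches)."""
    counts = Counter()
    for text in tickets:
        lower = text.lower()
        for theme, keywords in THEMES.items():
            if any(kw in lower for kw in keywords):
                counts[theme] += 1
    return counts

tickets = [
    "I was charged twice on my invoice",
    "Can't figure out the setup after signup",
    "The submit button on the form does nothing",
]
print(tag_themes(tickets).most_common())
```

Keyword matching is crude, but even this level of counting turns "CSAT dropped" into "a third of negative tickets mention the form" — which is something a team can act on.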
3. Bridge strategy with product reality
It's easy to promise "seamless service" in a strategy deck. Research tests whether that promise holds up in practice.
When buttercream's AI agent drafts a reply, does the human agent trust it enough to send it? When a customer hits the help center, do they find the answer? When a ticket escalates, does the next agent have full context?
These are research questions, not dashboard questions. And the answers determine whether your strategy delivers or just looks good on slides.
Making it practical
You don't need a dedicated research team to start. Three high-impact approaches:
- Shadow 5 agents for an hour each. You'll learn more about your support workflow in 5 hours than in 5 months of dashboard reviews.
- Run usability tests on your help center. Give 10 customers a real problem and watch them try to solve it. The gaps will be obvious.
- Analyze escalation paths. Look at tickets that required 3+ touches. Map the journey and find where context was lost.
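The escalation audit above can be partly automated. A minimal sketch, assuming you can export each ticket's handoff sequence along with a flag for whether context (notes, history) traveled with the handoff — the data shape and team names here are hypothetical:

```python
from collections import Counter

# Hypothetical export: each ticket is a list of (team, context_carried)
# hops, in the order the ticket moved through the organization.
tickets = [
    [("tier1", True), ("tier2", False), ("billing", True)],
    [("tier1", True), ("tier2", False), ("engineering", False)],
    [("tier1", True), ("billing", True)],
]

def lost_context_transitions(tickets, min_touches=3):
    """For tickets with min_touches or more hops, count the
    handoffs where context was not carried forward."""
    drops = Counter()
    for hops in tickets:
        if len(hops) < min_touches:
            continue  # focus on multi-touch tickets only
        for (src, _), (dst, carried) in zip(hops, hops[1:]):
            if not carried:
                drops[(src, dst)] += 1
    return drops

print(lost_context_transitions(tickets))
```

The output ranks handoffs by how often context is lost — in this toy data, the tier1-to-tier2 transition — giving you a short list of journeys worth mapping by hand.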
The goal isn't to replace metrics with research. It's to use research to explain the metrics — and then design experiences that actually move them.