Numbers Meet Narratives: Mixed-Methods Study for B2B Product
Uncovered workflow inefficiencies, leading to $2M in potential annual cost savings.
Overview
The research initiative aimed to improve the efficiency of repetitive onboarding tasks. The challenge was to understand and reduce the time B2B users spent completing these tasks within our system, while accounting for the broader context in which users also relied on external tools.
As lead UX Researcher for the project, I:
Designed and executed a mixed-methods study with participants across a global audience.
Identified a $2M annual cost-savings opportunity through time-on-task improvements.
Built a repeatable framework for measuring design impact over time.
Context
In complex enterprise environments, onboarding isn’t just a formality; it’s a critical, high-volume process that touches multiple systems and teams. When I joined this project, the goal was clear: help users spend less time on repetitive tasks and more time on meaningful work. This project highlighted the importance of striking a balance between quantitative rigor and qualitative depth. By focusing on what we could control and contextualizing user feedback, we created a foundation for iterative improvement and long-term impact.
My Role
Enterprise research often means navigating bureaucracy, legacy systems, and deeply embedded workflows. My role was about building trust, gaining access, and translating user realities into design opportunities.
As the UX researcher, I:
Designed and executed a mixed-methods study.
Recruited participants from an approved user base, navigating large enterprise red tape.
Engaged product and design partners throughout the study, from goal-setting to synthesis, to avoid research in a vacuum and drive shared ownership of insights.
Analyzed quantitative data in Excel and synthesized qualitative data in Dovetail.
Presented findings to stakeholders and eventually back to the business groups who use the product, as part of our listening sessions.
Research Objectives
Before diving into methods, we needed to align on what success looked like. Was it speed? Satisfaction? Simplicity? These objectives helped us stay focused and grounded as we explored a complex, multi-system workflow.
Measure how long it takes users to complete key onboarding tasks.
Understand user workflows across both our system and various external tools.
Identify usability barriers and legacy perceptions that impact task efficiency.
Methods
To understand how users really work, we needed both numbers and nuance. Time-on-task gave us the metrics, but interviews provided context. Together, they painted a full picture of the user experience.
1. Time-on-Task Measurement
I conducted usability sessions to capture how long users took to complete specific tasks. While users relied on both our system and their own tools, we measured time spent within our product (where we had control over design improvements) and also accounted for the external tools they needed to complete the task.
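As a concrete illustration, here is a minimal sketch of how coded session timings could be tabulated into in-product baselines. The task names, durations, and data layout are hypothetical placeholders, not the actual study data (which was analyzed in Excel).

from collections import defaultdict
from statistics import median

# Each observation: (task, seconds spent in our product, seconds spent in external tools).
# Values below are illustrative placeholders, not real measurements.
observations = [
    ("create_profile", 310, 120),
    ("create_profile", 280, 95),
    ("assign_access", 450, 200),
    ("assign_access", 430, 180),
]

in_product_times = defaultdict(list)
for task, ours, external in observations:
    # Benchmark only the time spent in our product (the part we can influence),
    # while still keeping external time on record for context.
    in_product_times[task].append(ours)

baselines = {task: median(times) for task, times in in_product_times.items()}
print(baselines)  # e.g. {'create_profile': 295.0, 'assign_access': 440.0}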
2. Moderated Qualitative Interviews
Following each session, I moderated interviews to explore user expectations, frustrations, and perceptions, while also addressing the elephant in the room: legacy changes that had shaped negative attitudes toward the current system.
Participant Engagement
One of the most rewarding parts of this project was hearing directly from the people who use the system every day.
To ensure diverse and representative insights, I conducted remote moderated sessions with 13 participants across multiple countries, bringing a truly global perspective to our findings. These sessions allowed me to observe real-time task completion while engaging in open-ended conversations about their workflows, frustrations, and expectations.
Tools
Dovetail – Research repository and qualitative analysis
Miro / Mural / FigJam – Stakeholder alignment, prioritization workshops
Microsoft Suite – Copilot, Planner, Outlook, PowerPoint, SharePoint, Teams
Calendly – Participant scheduling and coordination
Camtasia – Video editing and highlight reel creation
Challenges
Like every project, this one had some expected challenges: dual systems, diverse tools, legacy skepticism, and a small but highly experienced (and opinionated) user base. In every challenge lies a clue.
Dual-System Workflows: Users completed tasks using both our platform and their own systems, making it difficult to isolate total task time. We focused on measuring time within our system to ensure actionable insights.
Diverse User Programs: Each user operated within a different external program, adding variability to workflows and limiting standardization.
Legacy Perceptions: Many users had extensive experience with the product and viewed recent updates as regressions. This required careful framing of research to compare current performance objectively against legacy workflows.
Small, Expert User Base: The user pool was limited but highly experienced, which meant usability issues were often nuanced and deeply tied to long-term habits.
Impact
The real value of research is in what the findings unlock. In this case, we built a foundation for smarter design, clearer conversations, and measurable progress.
Established baseline benchmarks for time-on-task within our system. These benchmarks now serve as a foundational framework for future measurements, allowing the team to track design impact over time.
Shifted internal conversations from anecdotal feedback to data-driven decision-making.
Anchored quantitative data in qualitative feedback: it doesn’t matter how fast a task is if it feels slow or frustrating. A frustrating few seconds, repeated multiple times an hour, adds up to poor UX.
ROI Impact: Time & Cost Savings
Time Savings Potential: Based on observed workflows, onboarding specialists could save approximately 2 hours per week by streamlining repetitive tasks.
Estimated Cost Savings (ROI Impact):
At an average rate of $50/hour, this equates to $100 in potential savings per specialist per week.
With at least 3 specialists per location and over 160 branches, the theoretical savings could be:
3 × 160 × $100 = $48,000 per week
$48,000 × 52 weeks = $2,496,000 per year
That amounts to annual cost savings of over $2M once the suggested improvements are developed and implemented across all locations.
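For transparency, the savings model above can be expressed as a small, parameterized calculation. This is only a sketch of the same arithmetic, using the assumed inputs from this section (2 hours/week saved, $50/hour, 3 specialists per location, 160 branches), not a separate analysis.

def annual_savings(hours_saved_per_week, hourly_rate, specialists_per_branch, branches, weeks_per_year=52):
    # Weekly savings per specialist, scaled to all specialists and locations, then annualized.
    weekly_savings = hours_saved_per_week * hourly_rate * specialists_per_branch * branches
    return weekly_savings * weeks_per_year

print(annual_savings(2, 50, 3, 160))  # 2496000 -> reported conservatively as $2M+

Keeping the model parameterized makes it easy to rerun the estimate as assumptions change, for example if the number of specialists per location or the hourly rate is revised.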
Reflection
This project reminded me that good research isn’t just about uncovering problems; it’s about building trust, telling stories, and creating momentum. In a legacy-heavy, expert-driven environment, empathy and evidence go hand in hand.