Role
Strategy Advisor
Duration
Part of 6-Month Engagement
Industry
QSR
Focus
Measurement & Analytics
The Challenge
A multi-brand QSR company had a measurement problem hiding in plain sight. They had plenty of dashboards, but none were comprehensive, none served as a source of truth, and many weren't accessed by the right people or updated regularly.
Across brands, teams used different definitions and measurements for the same things. Leaders had conflicting opinions about which metrics mattered. The digital team couldn't connect their work to business goals, and without alignment on what to measure or why, prioritization was driven by opinion rather than evidence.
What I Did
I interviewed every leader who had a stake in digital measurement, collecting the metrics each team needed and documenting how they were currently defined and used.
- Identified duplicates and overlaps across brands, reconciled conflicting definitions, and established a single set of standardized measurements
- Tied every metric to OKRs and business outcomes, pushing back on vanity metrics that didn't connect to goals
- Defined a process for requesting data from the analytics team and for creating new dashboards
- Worked with the insights team to prepare them for the new request process, and helped the data dictionary team make updates
- Specified what dashboards should exist, what each should include, how often they should be updated, and who should have access
- Delivered a full implementation plan with periodic review cadences, handing off a ready-to-execute framework before the engagement concluded
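The standardized, OKR-linked metric definitions described above can be sketched as a single shared registry. This is a minimal illustration, not the engagement's actual tooling; the metric name, formula, and OKR below are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of a standardized metric definition: one shared
# record per metric, so every brand uses the same name, formula, and OKR link.
@dataclass(frozen=True)
class MetricDefinition:
    name: str      # canonical metric name shared across brands
    formula: str   # agreed calculation, replacing per-brand variants
    okr: str       # the business objective this metric supports
    owner: str     # team accountable for the data
    cadence: str   # how often dashboards refresh it

# A single registry acts as the source of truth; brand dashboards read
# from here instead of maintaining their own conflicting definitions.
REGISTRY = {
    "digital_order_rate": MetricDefinition(
        name="digital_order_rate",
        formula="digital orders / total orders",
        okr="Grow digital share of sales",
        owner="Analytics",
        cadence="weekly",
    ),
}

def lookup(metric: str) -> MetricDefinition:
    """Return the canonical definition, or fail loudly if none exists."""
    if metric not in REGISTRY:
        raise KeyError(f"{metric} has no standardized definition")
    return REGISTRY[metric]
```

The point of the sketch is the failure mode: a metric with no entry in the registry has no agreed definition, which is exactly the gap the reconciliation work closed.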
The Frameworks
I created two core frameworks to guide how metrics were categorized, prioritized, and organized into dashboards.
The Indicator Spectrum
Classifying metrics along a spectrum from effort to outcomes to insights, and prioritizing dashboard data starting from lagging indicators mapped to business goals.
The Metrics Hierarchy
Grouping metrics by purpose to define dashboard types and guide prioritization when requesting or building dashboards.
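The two frameworks can be illustrated together in a minimal sketch, assuming hypothetical metric names, dashboard groupings, and a three-point spectrum; the real frameworks were richer than this:

```python
from enum import IntEnum

# Indicator Spectrum: metrics sit somewhere between raw effort and insight.
class Spectrum(IntEnum):
    EFFORT = 1   # activity metrics (e.g. campaigns launched)
    OUTCOME = 2  # lagging indicators tied to business goals
    INSIGHT = 3  # diagnostic metrics that explain outcomes

# Metrics Hierarchy: grouping by purpose defines which dashboard a metric
# belongs to. All names below are illustrative, not from the engagement.
METRICS = [
    ("app_sessions", Spectrum.EFFORT, "engagement dashboard"),
    ("digital_revenue", Spectrum.OUTCOME, "executive dashboard"),
    ("conversion_by_channel", Spectrum.INSIGHT, "diagnostic dashboard"),
]

# Prioritization starts from lagging indicators mapped to business goals,
# then the insights that explain them, then effort metrics last.
PRIORITY = {Spectrum.OUTCOME: 0, Spectrum.INSIGHT: 1, Spectrum.EFFORT: 2}

def prioritized(metrics):
    """Order dashboard-build work: outcomes first, effort metrics last."""
    return sorted(metrics, key=lambda m: PRIORITY[m[1]])
```

Here `prioritized(METRICS)` would surface `digital_revenue` first, reflecting the rule that dashboards are built out from business-goal outcomes rather than from whatever activity data is easiest to report.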
The Transformation
Before
- Different brands using different definitions for the same metrics
- Dashboard sprawl with no source of truth
- Dashboards not accessed by the right people or updated regularly
- Leaders disagreeing on what to measure
- No process for requesting analytics or creating new dashboards
Delivered
- Unified measurement definitions across all brands
- Every metric tied to OKRs and business outcomes
- Dashboard specifications with ownership, access, and update cadences
- Defined processes for data requests and new dashboard creation
- Full implementation plan with periodic review schedule
The Outcome
The engagement produced a comprehensive, implementation-ready measurement framework that the team could execute on after handoff. Leaders across brands were aligned on what to measure and why, with standardized definitions replacing the inconsistencies that had made cross-brand comparison impossible.
The framework included dashboard specifications, data request processes, an updated data dictionary, and a periodic review cadence to keep measurement aligned with evolving business goals. The team was positioned to move from fragmented, opinion-driven reporting to a unified, outcome-focused analytics practice.
Key Takeaways
Alignment before dashboards
More dashboards don't solve a measurement problem. If leaders can't agree on definitions, no amount of reporting will create clarity.
Talk to the people, not the data
The framework came from interviewing every stakeholder who mattered. Understanding what each leader actually needed revealed the overlaps, gaps, and vanity metrics hiding in the system.
Design for handoff
A framework only works if the team can execute it after you leave. Implementation plans, process definitions, and review cadences make the difference between a deliverable and lasting change.