It is planning season. Budgets open, and product teams everywhere walk into the same trap: The Excel Roadmap.
Leadership asks for an annual plan. Product managers diligently gather requests from sales, support, and customers, organizing them into themes. They allocate their engineering capacity to the different themes based on perceived value and effort, resulting in a structured roadmap. The organization then aligns on timelines and milestones.
By mid-year, however, reality often diverges from the spreadsheet. Market conditions shift, user adoption misses predictions, or unforeseen complexities surface. The plan, though well-intentioned, was built on assumptions rather than validated evidence.
There is a better way.
Annual planning should not start with capacity. It should start by defining the in-scope activities of a clear roadmap toward a defined goal, and only then deploying the resources required.
As co-authors, we represent the typical tension in any planning room. One of us leverages his past experience at McKinsey and his current role at the health-tech company KHealth to deliver actionable insights on effective strategy and planning processes. The other is a product operator at Adobe, living in the trenches of user friction and technical debt. We used to speak different languages—'EBITDA' versus 'User Journey.' Our goal is to provide a practical method for translating high-level business ambiguity into concrete product action, preventing the friction that often arises between strategy and execution.
To demonstrate this framework in action, we will use a hypothetical fitness app called "Stride" as our running example.
Imagine you are on the product team building Stride's annual product budget. This article shows how a hypothetical "Social Engagement Squad" avoids the "Excel Roadmap" and builds a Scientific Roadmap - one where every item has a "Why" (Metric), a "Because" (Validation), and a "What" (Visual).
Phase 1: Defining the North Star
The first point of failure is the disconnect between business needs and what product teams influence.
Executives speak Output Metrics (Revenue, EBITDA). If the Stride CEO walks into planning, their ask is likely: "Grow revenue from Premium by 20%."
The trap is accepting this Output Metric as a product goal. You cannot code a credit card transaction; you can only code features that make people want to subscribe. Subscriptions are lagging indicators - results of value, not value itself.
Product teams operate on Input Metrics - user behaviors that generate value. To bridge this gap, you need a Proxy Metric.
The Proxy Metric
Before planning kicks off, Strategy & Operations meets Product to run a correlation analysis. They must prove that moving a product lever drives the financial lift.
Let’s say the team analyzes two years of data and finds a strong correlation: Free users who record >3 activities a week are 4x more likely to convert to Premium.
The conversation shifts from "fixing revenue" to a scientific pledge:
"We cannot directly control Premium buys. However, we can define the proxy metric - Weekly Active Users (WAU) - as the number of users who record >3 activities a week. Therefore, our Annual North Star is to Increase WAU by 15%. If we hit this Input Metric, our model projects the 20% Revenue lift you need."
This is the anchor. By targeting a user behavior (running more) rather than a financial outcome, the team gains the freedom to explore creative solutions. The question now becomes: "How do we get people off the couch?"
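The correlation analysis behind the proxy metric can be sketched in a few lines. Everything below is illustrative: the user records, the >3-activities bucketing, and the resulting lift are stand-ins for the two years of real data the team would analyze.

```python
from dataclasses import dataclass

# Hypothetical user records: weekly activity count and whether the user
# later converted to Premium. The data here is invented for illustration.
@dataclass
class UserWeek:
    activities_per_week: int
    converted: bool

users = [
    UserWeek(1, False), UserWeek(2, False), UserWeek(2, True),
    UserWeek(4, True), UserWeek(5, True), UserWeek(4, False),
    UserWeek(0, False), UserWeek(6, True), UserWeek(3, False),
    UserWeek(5, True),
]

def conversion_rate(group):
    """Share of users in the group who converted to Premium."""
    return sum(u.converted for u in group) / len(group)

# Bucket by the behavior being tested as a proxy: >3 activities a week.
engaged = [u for u in users if u.activities_per_week > 3]
casual  = [u for u in users if u.activities_per_week <= 3]

lift = conversion_rate(engaged) / conversion_rate(casual)
print(f"Engaged users convert {lift:.1f}x more often than casual users")
```

On real data, this bucketed comparison is the quickest way to surface the "4x more likely to convert" claim before investing in a full model.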
The metric definition workshop
In our experience, defining this proxy metric is often the hardest part of the year. We recall early career planning cycles where teams accepted generic goals like "Improve Retention", only to struggle for months because the goal was too broad.
Actionable insight: Start by listing the drivers of executive output metrics and analyzing their historical trends. Run dedicated "Metric Workshops" to map user behaviors directly to these drivers. If you cannot prove that a specific user action leads to the business outcome, pause the planning. It is better to address the data gap now than to build a strategy on assumptions.
Phase 2: Hypothesizing
With the goal set, avoid the temptation to jump straight to solutions ("Let's build leaderboards!"). This is "Feature Factory" thinking.
To build a resilient strategy, we use a Strategy Kernel framework (adapted from Richard Rumelt). This forces you to unpack the logic of your bet before defining the feature. It consists of three parts: Diagnosis, Guiding Policy, and Coherent Action.
Here is how the Stride squad uses it:
1. The Diagnosis (The Problem): Why are users not running three times a week? Research shows casual runners lack internal willpower and feel lonely running solo.
- Diagnosis: "Users suffer from a lack of accountability and feel lonely running solo."
2. The Guiding Policy (The Approach): Building better "willpower" features (for example, louder or more frequent notifications) feels weak. The team pivots.
- Guiding Policy: "Creating and fostering 'external social pressure' by leveraging the friend graph is a more actionable lever than relying on 'internal willpower'."
3. The Coherent Action (The Feature): Only now do we discuss features.
- Coherent action: "Build 'Squad Challenges' - a synchronous feature where friends race against a collective goal, creating mutual accountability."
If the team had just said "Let's build Challenges," they might have built a solo version that failed to solve the "Loneliness" diagnosis. The Kernel framework ensures the feature is rooted in a psychological hypothesis.
Slow down to speed up
An effective strategy relies on clearly identifying the root cause of a challenge before attempting to solve it. While this diagnosis takes time, being meticulous in this phase pays dividends for the rest of the year.
Actionable insight: Implement a "Diagnosis First" rule. Applying the "5 Whys" framework helps trace the problem back to its root cause, ensuring the solution logically aligns with the team's findings. This prevents treating symptoms rather than the core issue. Think of the written diagnosis as a necessary filter: if the team cannot agree on the "illness," it is too early to prescribe the "medication."
Phase 3: Validation
Most teams would now estimate "Squad Challenges" at six months of work and slap it on the roadmap. This is a mistake.
You have a hypothesis (Social Pressure drives frequency), but no proof. Committing significant engineering resources to an unproven idea is risky. To mitigate this, we introduce a gate to verify our assumption. This gate comes earlier than an MVP launch: the point is to gather practical evidence for our assumptions before an initiative ever lands on the roadmap.
The painted door test
Validation can take many forms, from customer interviews and surveys to low-fidelity prototype testing. For a high-stakes bet like this, the squad chooses a high-signal method: the "Painted Door" test (or fake door test).
During a two-week sprint, they place a new card on the Stride home screen alongside the existing "Solo Time Trial" card: "New! Start a Squad Challenge with Friends."
Clicks trigger a polite modal: "Thanks for your interest! We’re building this now. Join the waitlist."
The results:
- Baseline "Solo Time Trial" button CTR: 2%.
- New "Squad Challenge" button CTR: 12%.
This is quantified demand. The initiative moves from a guess to a validated bet, earning its place on the roadmap.
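Before calling 12% versus 2% "quantified demand," it is worth a quick significance check. A two-proportion z-test is one common approach; the impression counts below are hypothetical, since only the CTRs are given above.

```python
import math

# Hypothetical impression counts for the two-week test. The 2% and 12%
# CTRs match the results above; the sample sizes are illustrative.
solo_clicks, solo_views = 40, 2000      # baseline "Solo Time Trial": 2% CTR
squad_clicks, squad_views = 240, 2000   # "Squad Challenge" card: 12% CTR

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test: is the CTR difference statistically real?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

z = two_proportion_z(solo_clicks, solo_views, squad_clicks, squad_views)
print(f"z = {z:.1f}  (z > 1.96 means significant at the 95% level)")
```

At these sample sizes the signal is unambiguous; with only a few hundred impressions, the same CTRs could plausibly be noise, which is exactly why the check matters before a six-month commitment.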
Mitigating execution risk
We have encountered situations where significant initiatives launched in Q1 didn't yield the expected results by mid-year. In one instance, a team dedicated months to building an integration based on intuition, only to see low initial adoption.
Actionable insight: View validation as a risk management tool. Identify a validation method that can give you enough confidence within a few weeks to commit to a one-year strategy. Whether it’s a painted door test, a client focus group, or an MVP, gathering data to support your hypothesis helps avoid draining resources on long builds that may lack potential.
Phase 4: Visualization & the vision
We have a Metric, a Strategy, and Evidence. But we still need Stakeholder Alignment. Executives don't read PRDs or Jira tickets. Stakeholders buy visions.
To win the resource war, you need to show them what the future looks like. You need a Vision Prototype. It serves as a North Star for the user experience, meant to evoke emotion and direction.
The AI accelerator
Creating high-fidelity video prototypes used to take weeks. Today, the Stride PM leverages Generative AI (like Nano Banana) to rapidly generate a vision video.
The squad creates a 30-second concept video of the Squad Challenge experience. It looks visceral and captures the feeling of social pressure. When the VP of Product sees it, the conversation shifts from debating resource allocation to discussing the strategic value of the experience.
Moving beyond text-heavy PRDs
Early in our careers, we spent days writing lengthy Product Requirement Documents (PRDs), often finding that stakeholders struggled to absorb the nuances buried in the text.
Actionable insight: Supplement written requirements with visual storytelling. Use storyboards, short walkthrough videos, or AI-generated concepts to show the desired user experience. Avoid overengineering the visual storytelling. Instead, highlight the few key elements of the desired user experience that resonate at an executive level. This shifts the conversation toward the value for the user, rather than just costs or timelines.
Phase 5: Initiatives & alignment
We arrive at the planning meeting: "The Handshake" between Strategy and Execution.
The squad walks in not with a wishlist, but a business case:
"Our goal is to move Weekly Active Athletes by 15%.
Our diagnosis identifies Loneliness as the barrier and we hypothesize Social Pressure is the fix.
We validated this with a two-week Painted Door test showing 12% demand for group features.
We propose building Squad Challenges. Here is the Concept Video of the experience.
We estimate this initiative requires six sprint cycles to deliver."
This connects the dots, creating a compelling case that is far harder to dismiss than a simple feature request.
Defining success (milestones)
To maintain trust, the squad breaks delivery into milestones, starting with a quarter-by-quarter view that stakeholder teams will then plan in detail:
- Q1: MVP (Validation). Launch basic "Squad Goals" (simple text-only progress). Goal: Validate retention lift.
- Q2: Beta (Adoption). Launch "Visual Track" (simple avatars on a 2D map). Goal: Drive viral adoption as users invite friends to race.
- Q3: Scale (Growth). Open to all geographies and social graphs.
- Q4: Monetization (Revenue). Gate advanced features behind Premium to capture the CEO's revenue goal.
The communication plan
Finally, the PM launches a "Strategy Roadshow," sharing the concept video with Engineering to align them on the "Why" (solving loneliness) and with Marketing to seed the campaign narrative. This ensures that everyone is aligned on the end target and the reason why.
The roadmap is a contract, not a list
The biggest friction point we see between Strategy and Product is when a roadmap is treated as a rigid "feature list." If the Q1 MVP fails to drive retention, the plan must change.
Actionable insight: Frame your roadmap presentation as a commitment to outcomes, not just output. When you present, say: "We are committing to solve the loneliness problem of solo runners. Our first bet is Squad Goals. If the Q1 data prove it, we will scale the solution; otherwise, we will pivot to the next hypothesis." This builds trust with leadership that you are focusing on the outcome, not just managing the backlog.
Conclusion
The "Excel Roadmap" offers an illusion of control. Building a Scientific Roadmap simply requires more upfront rigor—validating metrics, strategies, and demand before committing resources.
Our hypothetical Stride team didn't just ask for resources; they proved the value of their ideas before writing code. They moved from guess-planning to informed and pre-tested planning.
This year, aim to validate your roadmap before you commit to it.