Developing a feature health dashboard at Sofatutor
Project scope
Team: Product manager, 2 product designers, 1 product analyst, developers, 1 user researcher (me)
Timeline: 3 weeks
Company: Sofatutor - a digital learning platform for students
Methods: Workshop, Amplitude analytics
The challenge
Product teams were using scattered dashboards, one-off charts, and ad-hoc analyses, but there was no reliable, shared view of how features were performing.

The goal was to create a centralized source of truth and build a shared process around it that would inform decisions across the product org.
My role
I led the internal research and facilitation work by:
- Aligning stakeholder needs
- Creating and iterating on the dashboard based on those needs
- Embedding it into decision-making processes
- Driving team adoption and ongoing maintenance
Impact
The centralized dashboard became the most viewed in the product org and is now integrated into recurring sessions, including roadmap planning and monthly KPI reviews.
It surfaced insights such as a drop in conversion rate that triggered follow-up research.
The need for a single source of truth
When I joined Sofatutor, multiple unstructured dashboards and charts were scattered across teams, with no clear ownership or maintenance. There was no centralized view of feature health, user engagement, or adoption trends, making it difficult to track product performance effectively.
We needed a single source of truth that would:
- Track key metrics and feature releases over time
- Surface drops in metrics, as well as opportunities
- Support strategy and data-driven decision-making
The process
1. Aligning stakeholder needs
2. Prototyping and iteration
3. Embedding it into workflows
4. Driving team adoption
1. Aligning stakeholder needs
To ensure the dashboard met the needs of different teams, I designed a collaborative workshop in Miro, bringing together product managers, designers, engineers, and data analysts.
The goals were to:
- Uncover how different teams define “feature health”
- Prioritize critical metrics and KPIs
- Align on use cases and workflows for the dashboard
Workshop activities included:
- Individual reflection & sharing: Participants wrote down how they’d use the dashboard and what decisions it should support. This helped surface use cases from different perspectives.
- Collecting & prioritizing key metrics and features: Participants collected and then voted on the most important KPIs, ensuring we avoided dashboard overload down the road.
- Sketching the “dream dashboard”: Rather than jumping into analytics, participants sketched their ideal dashboard on paper. This encouraged open thinking and gave me a clearer picture of what “useful” looked like to each person.
- Defining integration into workflows: We ended with a discussion of how the dashboard would fit into existing workflows, like KPI reviews or strategy planning, to ensure it would see practical use.


2. Prototyping and iteration
Using insights from the workshop, I created an initial dashboard prototype in Amplitude, focusing on:
- User activity & behavior (e.g., session frequency, session duration, monthly active users)
- Feature adoption & engagement (e.g., most/least used features)
- Conversion funnel metrics (e.g., trial-to-paid, completion rates of learning activities)
- Impact of new feature releases on product performance, using Amplitude’s feature release function
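As an illustration of what a funnel metric like trial-to-paid boils down to, here is a minimal sketch of computing it from raw events. The event names and data are hypothetical, and in practice Amplitude computed these metrics for us; this only shows the underlying idea.

```python
# Hypothetical Amplitude-style event log: (user_id, event_name) pairs.
events = [
    ("u1", "trial_started"), ("u1", "subscription_purchased"),
    ("u2", "trial_started"),
    ("u3", "trial_started"), ("u3", "subscription_purchased"),
    ("u4", "trial_started"),
]

def funnel_conversion(events, step_a, step_b):
    """Fraction of users who did step_a that also did step_b
    (event ordering is ignored for simplicity)."""
    did_a = {user for user, event in events if event == step_a}
    did_b = {user for user, event in events if event == step_b}
    if not did_a:
        return 0.0
    return len(did_a & did_b) / len(did_a)

rate = funnel_conversion(events, "trial_started", "subscription_purchased")
print(f"trial-to-paid: {rate:.0%}")  # → trial-to-paid: 50%
```

Real funnels add time windows and step ordering, but the core is the same set arithmetic.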
The first version was tested with stakeholders, and based on their feedback I made refinements, such as adding guidance on how to interpret certain charts and fuller descriptions, so that all stakeholders could read the data correctly.
3. Embedding it into workflows
Once finalized, I worked with the team on guidelines to make the dashboard stick, for example:
1. Daily & ad-hoc use:
- Real-time alerts when key metrics drop (shared on Slack).
- Feature release tracking to measure the impact of new releases.
2. Regular check-ins & reporting:
- Monthly KPI reviews within our existing KPI meetings, where I present insights when there are any.
- Research recaps integrating dashboard findings into UX reports to inform roadmap decisions.
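The metric-drop alerts described above can be sketched in a few lines. This is an illustration, not Sofatutor’s actual setup: Amplitude provides built-in alerting, and the webhook URL, metric names, and threshold here are hypothetical placeholders.

```python
import json
import urllib.request

# Hypothetical Slack incoming-webhook URL (placeholder, not a real endpoint).
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def drop_alert(metric, current, baseline, threshold=0.10):
    """Return an alert message if `current` is more than `threshold`
    (as a fraction) below `baseline`, else None."""
    if baseline <= 0:
        return None
    drop = (baseline - current) / baseline
    if drop > threshold:
        return f":warning: {metric} dropped {drop:.0%} vs. baseline ({current} vs. {baseline})"
    return None

def post_to_slack(message):
    """Send a message to a Slack incoming webhook as a JSON payload."""
    payload = json.dumps({"text": message}).encode("utf-8")
    request = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

alert = drop_alert("trial-to-paid conversion", current=0.042, baseline=0.055)
if alert:
    print(alert)  # in production this would be post_to_slack(alert)
```

The 10% threshold is a tuning choice: too tight and the channel fills with noise, too loose and real drops slip by unnoticed.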
4. Driving team adoption
To drive adoption within the team, I introduced the dashboard in a fun, interactive session that included:
- A quick walkthrough of the tool and the dashboard, covering the key metrics of our current features
- Quizzes like “What’s the most used learning feature at Sofatutor?”
- Hands-on exploration to help teams navigate the tool confidently

The goal of the session was to give the team an overview of how our features were performing overall, and to enable them to navigate and interpret the dashboard without my guidance.
The outcome
- A go-to tool for strategic planning: The dashboard is now a core part of our quarterly research recaps, helping product managers prep for strategy meetings.
- Spotting issues & tracking wins: We can now quickly detect problems or track successes. For example, when completion rates for a learning feature dropped, it kicked off a research investigation. On the flip side, when we improved content for the dictation feature, we could track how student success rates improved.
- One shared source of truth: Product, design, data, and engineering finally have one place to go for key metrics.
Key learnings and reflections
- Alignment from the start is key: Getting everyone’s perspective early on made all the difference. It ensured the dashboard reflected real needs and had buy-in from all teams.
- Implementation is only half the work: The real challenge is driving adoption. Repeatedly reminding the team of the dashboard, reinforcing usage, and making it part of ongoing conversations helped it stick.
- Don’t forget about maintenance: I’ve learned to set aside time each month to review the dashboard, keep it up to date, and adjust my planning accordingly. A dashboard is only as good as how well it’s maintained.