From Compliance to Clarity: How to Make Data Work for You
By Lindsay Wood, Founder and Consultant at RootED Analytics

If your team spends more time collecting data than using it, you’re not alone. Across education entities, it’s common to see dashboards overflowing with metrics, spreadsheets updated solely for reporting, and meetings where “reviewing the data” feels more like checking a box than learning something new.
It’s not that people don’t care — it’s that our systems weren’t designed for reflection. Over time, the push for accountability and compliance has crowded out the deeper purpose of data: to guide thoughtful improvement. The result? Educators are swimming in numbers but starved for insight.
National polling shows that while most teachers (95%) use multiple kinds of data to understand their students, over half (57%) say they don’t have enough time during the school day to actually work with that information, and over a third (34%) believe there is simply too much data to wade through (Data Quality Campaign, 2018). That imbalance is more than just a workload problem — it’s a clarity problem.
So how do we make data work for us again?
From Reporting to Reflection
Most data systems in education were built to meet reporting requirements, not to inspire learning. Reports go up; spreadsheets get filed away; another cycle begins. The intention is accountability, but the outcome is often exhaustion.
The real issue isn’t that we collect data — it’s that we collect it without a clear purpose. We’re answering questions no one is asking, while the questions that do matter go unmeasured.
Assessment expert Dylan Wiliam calls this the shift from data-driven decision making to decision-driven data collection:
“People who espouse data-driven decision making tend to focus on the data. They collect data hoping that they might come in useful at some point in the future. Those who focus on decision-driven data collection decide what they want to do with the data before they collect the data, and collect only the data they need. That way, they always know what to do with the data.”
(Embedded Formative Assessment, 2011, p. 47).
This shift — from collecting for collection’s sake to collecting with a purpose — is what turns data into a catalyst.
A Real Example: Less Data, More Direction
One department I partnered with tracked more than 30 indicators for their quarterly data meetings. On paper, it looked thorough. In practice, it was paralyzing. These indicators weren’t tied to specific goals or outcomes — they were simply easy to track.
This felt fine until it was time to discuss the data. Conversations lacked direction, patterns were unclear, and by the end of each meeting, there were no actionable takeaways.
When we redesigned their system, we narrowed the list to eight indicators — those directly tied to the department’s strategic goals. Within months, data discussions shifted from passive updates to meaningful dialogue. Educators began asking questions such as:
“What patterns do we notice between excused and unexcused absences? Do these patterns differ by grade level, socioeconomic status, race, gender, or disability status?”
“After we implemented the attendance campaign, what changed? What stayed the same?”
“How does educator attendance compare to student attendance? Do we notice any relationships or patterns?”
By simplifying the system, we didn’t lose information — we gained focus.
This approach aligns with what the Carnegie Foundation for the Advancement of Teaching calls disciplined inquiry: identifying a small set of measures tied directly to improvement aims and using them iteratively to learn, not just to prove (Sherer et al., 2020).
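To make the first question above concrete: when attendance records live in a spreadsheet or a student information system export, a focused question translates directly into a small, answerable analysis. The sketch below is a minimal illustration in Python with pandas, assuming a hypothetical attendance.csv export with one row per absence and grade_level and absence_type columns; the file and column names are placeholders, not the department's actual data.

import pandas as pd

# Hypothetical export: one row per absence, with "grade_level" and
# "absence_type" ("excused" or "unexcused") columns.
absences = pd.read_csv("attendance.csv")

# Cross-tabulate absence type by grade level to surface patterns worth discussing.
counts = pd.crosstab(absences["grade_level"], absences["absence_type"])

# Express each grade's absences as shares, which makes comparison across grades easier.
shares = counts.div(counts.sum(axis=1), axis=0).round(2)

print(counts)
print(shares)

The tooling matters far less than the point it illustrates: a well-framed question can be answered with a handful of lines instead of another thirty-indicator dashboard.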
Why Clarity Matters
When teams have too much data and not enough structure, interpretation becomes guesswork. Numbers get presented but not discussed. Insights stay buried.
But when data is organized around the questions that matter:
Focus replaces volume.
Interpretation is shared, not siloed.
Reflection becomes routine.
Data conversations feel safe, not punitive.
The Harvard Data Wise Project calls this an inquiry stance — a culture where data isn’t used to judge, but to understand (Boudett, City, & Murnane, 2013). That’s when reflection begins to stick.
Making It Practical: How to Start Right Now
If you’re ready to move from compliance to clarity, here are five ways to start:
Ask before you collect. Every data request should begin with: “What decision will this help us make?”
Keep it simple. Limit each goal to three to five indicators that reflect both progress and process.
Build reflection into your existing structures. Reserve agenda time for discussion, interpretation, and “what’s next.”
Pair numbers with stories. Complement quantitative trends with qualitative insights from students, staff, or families.
Audit your data clutter. Review one legacy report or spreadsheet and ask: “Who actually uses this — and how?” A small scripted version of this audit is sketched after this list.
These are small shifts, but they add up — especially when leadership models that clarity is more valuable than quantity.
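If legacy reports live in a shared folder, one low-effort way to begin that audit is a scripted pass that lists files by how recently anyone has touched them. The sketch below is a minimal example in Python; the folder path and file pattern are assumptions, and a last-modified date is only a proxy for actual use, so the follow-up question still goes to the people who are supposed to read each report.

from datetime import datetime
from pathlib import Path

# Hypothetical location of quarterly report files.
reports_dir = Path("shared_drive/quarterly_reports")

# Sort report files from least to most recently modified.
report_files = sorted(reports_dir.glob("*.xlsx"), key=lambda f: f.stat().st_mtime)

for report in report_files:
    modified = datetime.fromtimestamp(report.stat().st_mtime)
    print(f"{modified:%Y-%m-%d}  {report.name}")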
Closing the Gap Between Data and Decisions
The truth is, data doesn’t improve anything on its own. People do — when they have time, structure, and clarity.
When we shift from compliance to clarity, we stop treating data as a burden and start treating it as a bridge: between questions and answers, between leaders and classrooms, between intention and impact.
At RootED Analytics, that’s what our work is all about — helping education leaders design systems that are rooted in evidence, grounded in practice, and built for continuous improvement.
Because data should feel helpful, not heavy.
References
Boudett, K. P., City, E. A., & Murnane, R. J. (2013). Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning. Harvard Education Press.
Data Quality Campaign. (2018, September 12). What parents and teachers think about education data. Retrieved from https://dataqualitycampaign.org/resource/what-parents-and-teachers-think-about-education-data/
Sherer, D., Norman, J., Bryk, A. S., Peurach, D. J., Vasudeva, A., & McMahon, K. (2020). Evidence for Improvement: An Integrated Analytic Approach for Supporting Networks in Education. Stanford, CA: Carnegie Foundation for the Advancement of Teaching. Retrieved from https://www.carnegiefoundation.org/wp-content/uploads/2020/02/Carnegie_EFI_Report_2020.pdf
Wiliam, D. (2011). Embedded Formative Assessment. Solution Tree Press.

