Picture two eighth graders taking the same math test. Both get question 12 wrong. But one student spent 45 seconds, never used the digital notepad (scratchwork) tool, and moved on. The other spent three minutes, used scratchwork extensively—writing and erasing multiple times—and returned to the question twice.
Traditional scoring treats these students identically—both missed the question. But their approaches tell vastly different stories about what they need to succeed.
This is the promise of process data: the digital record of every click, pause, and tool use students make during computer-based learning and assessments. New research suggests these digital breadcrumbs can help us understand persistent achievement gaps.
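To make this concrete, here is a small sketch of the kind of features process data supports. The event format below is invented for illustration (real assessment logs such as NAEP's differ in structure); it replays the two students from the opening example and recovers time on task, number of visits, and scratchwork use.

```python
# Hypothetical clickstream events: (student_id, item, timestamp_sec, action).
# This log format is made up for illustration, not an actual NAEP schema.
from collections import defaultdict

events = [
    ("A", "Q12", 0,   "enter"),
    ("A", "Q12", 45,  "leave"),              # student A: one quick pass
    ("B", "Q12", 0,   "enter"),
    ("B", "Q12", 20,  "scratchwork_open"),   # student B opens the digital notepad
    ("B", "Q12", 90,  "leave"),
    ("B", "Q12", 150, "enter"),              # ...and returns to the question
    ("B", "Q12", 180, "leave"),
]

# Aggregate per (student, item): total seconds, visit count, tool use.
features = defaultdict(lambda: {"time": 0, "visits": 0, "used_scratchwork": False})
entered = {}
for student, item, t, action in events:
    key = (student, item)
    if action == "enter":
        entered[key] = t
        features[key]["visits"] += 1
    elif action == "leave":
        features[key]["time"] += t - entered.pop(key)
    elif action == "scratchwork_open":
        features[key]["used_scratchwork"] = True

print(dict(features[("A", "Q12")]))  # brief single visit, no scratchwork
print(dict(features[("B", "Q12")]))  # longer, two visits, scratchwork used
```

Even this crude summary distinguishes the two students that traditional scoring treats as identical.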
In a study published in Psychometrika, researchers Sunbeom Kwon and Susu Zhang analyzed thousands of student records from the 2017 National Assessment of Educational Progress (NAEP) Grade 8 Math assessment. They found that differences in students’ test-taking behaviors accounted for a statistically significant portion of the achievement gap between students with learning disabilities and their peers. These behaviors include how long students spend on each question, whether and how they use tools such as the digital notepad, and whether they return to questions to check their work.
These findings matter because strategies and effective tool use can be taught and learned. When students learn to approach problems more efficiently—such as by choosing the right tools, catching careless errors, and managing their time strategically—they don’t just perform better on tests; they also develop transferable skills that support learning across subjects and contexts.
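The idea that behaviors can "account for" part of a score gap can be sketched with a toy regression adjustment. This is an illustration only, with synthetic data and hypothetical effect sizes; it is not the authors' actual psychometric model. The point it shows: if a behavioral feature (here, a made-up scratchwork-use rate) differs between groups and predicts scores, the estimated group gap shrinks once that behavior is controlled for.

```python
# Toy illustration (synthetic data, not the study's method): how much of a
# group score gap is statistically accounted for by a test-taking behavior.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)                       # 1 = focal group (synthetic)
# Hypothetical behavior feature: proportion of items with scratchwork use.
behavior = 0.6 - 0.3 * group + rng.normal(0, 0.1, n)
# Scores depend mostly on behavior, plus a smaller direct group effect.
score = 50 + 20 * behavior - 2 * group + rng.normal(0, 5, n)

def group_coef(X, y):
    """Least-squares coefficient on the group indicator (column 1)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

raw_gap = group_coef(np.column_stack([np.ones(n), group]), score)
adj_gap = group_coef(np.column_stack([np.ones(n), group, behavior]), score)

print(round(raw_gap, 1), round(adj_gap, 1))  # gap narrows after adjustment
```

In this simulation the raw gap is several points, while the behavior-adjusted gap is much smaller: the difference between the two is the portion "accounted for" by behavior.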
Below, we share practical recommendations for teachers, district leaders, and policymakers on bringing strategy instruction, and the data infrastructure that supports it, into schools.
Achievement gaps have persisted despite countless interventions targeting content knowledge alone. This research offers a complementary lens: gaps aren’t just about what students know, but are also about how students approach problems.
The encouraging news is that strategic behaviors are malleable. A student who rushes can learn to plan. A student who never uses scratchwork can be taught when it helps. A student who gives up quickly can develop persistence strategies.
Process data makes the invisible visible. Capturing these insights requires no new testing, only learning from data that already exists but is currently discarded.
Prioritize transparency. Publish plain-language family guides explaining what process data is collected, how it’s stored, and how it improves learning. Commit explicitly that it will never be used for high-stakes decisions.
Teachers: After your next quiz, have students reflect for five minutes: “What was your strategy? What would you try differently?” Use their insights to guide discussion about effective approaches.
District leaders: Ask your assessment vendor: “What process data do we collect? Can we access it?” You may be surprised by what’s already available.
Policymakers: Review state assessment contracts. Identify where you could require process data reporting with strong privacy guardrails.
Read the full study at Cambridge University Press.