A key part of this process is understanding how students view and experience AI. Empathy interviews allow educational leaders, practitioners, and researchers to hear directly from learners about their experiences, surfacing valuable student perspectives. Findings from empathy interviews can inform how districts approach AI guidelines.
Mary Catherine Reljac, superintendent of Fox Chapel Area School District in Pittsburgh, Pennsylvania, first conducted empathy interviews in 2024. Her goal was to understand how students in her district felt about artificial intelligence and consider how these perspectives might inform her district’s responsible use guidelines.
When Reljac and her team first asked students about their use of AI in 2024—including how they envision using AI, where they think the district should establish guardrails, and what leaders can do to deepen understanding—the findings were illuminating.
One year later, in 2025, Reljac decided to conduct the interviews again with both middle school and high school students in her district. When compared to the previous year’s results, these conversations revealed three takeaways about how students are perceiving and interacting with AI in school.
In both sets of interviews, students reported extremely high usage rates of AI tools, estimating that 80 to 90 percent of their peers had used AI at least once. Students reported that they commonly use AI tools like ChatGPT, Grammarly, and Google Gemini for study guides, essay checking, math problems, and research assistance.
A central tension exists between using AI as a legitimate learning aid and academic dishonesty. Students acknowledge that many peers use AI for cheating, but they also recognize the technology’s potential as a valuable educational tool—when used appropriately.
Students recommended that educators take a more proactive and structured approach to addressing cheating with AI by establishing clear communication and boundaries. They suggested having explicit conversations about when AI use is appropriate or inappropriate for specific assignments, rather than leaving students to guess the rules.
The students also voiced concerns and hesitations about sharing private information with AI tools; in particular, high school students expressed the belief that younger students need better education about what AI is before using it extensively. Overall, students wanted more instruction on privacy and the responsible use of AI.
The 2025 empathy interviews revealed that students want schools to teach proper usage of AI and integrate it meaningfully into curriculum. Students demonstrate sophisticated awareness of AI’s shortcomings; for example, they noted that generative AI tools often provide wrong answers for math and science problems, may not align with their personal notes for study guides, can generate inaccurate sources, and produce generic writing that doesn’t match their voice.
But students want to learn how to leverage AI tools in their daily lives. They understand that their futures will depend on their ability to use this technology responsibly and effectively, both in and out of school. For example, one student said, “I like to use [AI] to show me a strategy to use for later problems, then try it on my own, then double check.” Others shared that they use these tools for their own creative projects outside of school.
Reflecting on the 2025 empathy interviews, Reljac said, “Our students expressed strong opinions about AI that are developing over time. They provided candid feedback on the district’s current approaches and asked for more AI literacy—something that we are continuing to focus on.” As Fox Chapel Area School District continues to shape its AI policies, the students’ perspectives will play a central role. “Their voices will significantly shape the work for staff and students in the upcoming school year.”