Here are a few key takeaways from the themes that emerged. Each speaks to the power of grounding AI innovation in what we already know works: centering educators, prioritizing student learning, and leading with equity.
It can be easy to get swept up in new tech tools, especially when AI is so heavily marketed and so widespread in education. Dr. Punya Mishra reminds us to slow down and start from our values as educators. “What we do in schools, in classrooms, is something different than just a gee-whiz technology. It is engaging with learners, getting where they are coming from, creating context within which they can learn and thrive,” he said in our “AI Integration and Technological Pedagogical Content Knowledge (TPACK)” AMA session.
That learning-first approach is already showing up in classrooms. Jaime Donally, a former math teacher and founder of #ARVRinEDU, shared how teachers are using AI tools to extend lessons and support overall planning. What stood out most was the focus on access and relevant professional learning. Addressing these needs can support teachers as they try new things, which also opens the door to learning alongside students and sharing with peers what works, and what doesn’t.
At the same time, building AI literacy alongside this experimentation is critical for protecting student data and ensuring tools support powerful learning. During the “AI Literacy Support” AMA, Megan Pattenhouse, project director of emerging tech implementation at Digital Promise, explained, “AI literacy is building practical skills so students can use, understand, and evaluate AI-enabled tools.” And it’s not a brand new starting point: she reminded us that these skills are grounded in things educators already teach—computational thinking, media literacy, and critical analysis. It’s not about having all the answers. It’s about equipping ourselves and our students with the skills to ask better questions.
As schools continue to integrate AI tools, the question is not just what a tool can do, but what it should do. During the “AI Research, Policy, and Ethics” AMA, Dr. Sarah Burriss emphasized that responsible adoption must go beyond functionality: “From privacy to environmental impact to alignment with pedagogical goals, there are many things to consider.” Resources like EngageAI’s AI Bill of Rights for Educators and forthcoming AI Compass tool help schools ask the hard but necessary questions about transparency, student data, and long-term impact.
Trust is key. Burriss underscored the importance of listening to and involving teachers, students, and families early and often. If we want AI to serve learning, we have to ensure the systems we adopt reflect the values of the communities we serve. That includes understanding when a tool doesn’t align, and feeling empowered to say no.
Cybersecurity expert Dr. Timothy Summers took that one step further in the “AI and Cybersecurity in Education” AMA session. He outlined a range of AI-related threats that many educators might not even be aware of yet, from model inference attacks to deepfake misuse. Even AI-powered monitoring tools can unintentionally leak sensitive student data. Summers offered clear, practical steps schools can take to protect their communities, including strong data policies, collecting only the student data that is needed, and teaching digital safety as part of everyday learning. Safety and equity go hand in hand; if students can’t trust the systems around them, they can’t take risks or thrive within them.
While much of the energy around AI is happening in the classroom, real progress depends on alignment at every level—from district tech leads to state policy teams. Educators shouldn’t have to carry the weight of AI integration alone.
Jean-Claude Brizard, president and CEO of Digital Promise, echoed this systems-level view in the “Organizational, District, and Policy Views on AI in Education” AMA: “AI for the most part, is going to show up in edtech tools in classrooms and in schools, and most districts do not have a cogent edtech strategy…at the state and district level…please make sure your Chief Academic Officer, your Chief Information Officer, or Chief Technology Officer, are in the same room and planning together. When you have that coordinated, coherent effort on an instructional program and the technology as a tool to accelerate and support, then you have a really cogent strategy and the AI makes sense in supporting [all learners]. All of it shows up in a way that is coherent.”
Systems-level decisions have to be in step with what’s actually happening in classrooms. Across every AMA, educators called out the need for time to explore, support to grow, and meaningful ways to make their voices heard, especially when it comes to choosing and shaping the tools they’re expected to use.
Continue your learning with these related Digital Promise resources: