
Transforming K-12 Education with AI: A New Report with Insights from 28 Exploratory Projects

October 23, 2024

Given the rapid advances in AI and the momentum in the education field to understand how these technologies can support teaching and learning, the Gates Foundation last year launched a pilot initiative to fund tests of new AI ideas in support of equitable K-12 mathematics outcomes. This is the final post in a series of five elevating key learnings from this set of investments. Today’s post announces the publication of the report highlighting the cohort’s work.

How can schools and districts use AI in ethical and equitable ways? What support do they need to navigate their AI journeys? Is it even possible to develop strategies that keep pace with this constantly evolving technology? AI is transforming K-12 education, and with this rapid change comes a wave of unknowns, anxieties, and questions.

Digital Promise’s new report, “An Ethical and Equitable Vision of AI in Education: Learning Across 28 Exploratory Projects,” shares how a group of exploratory projects led by district, edtech developer, and researcher teams from across the country has been working to shape an equitable future for AI in schools. With support from the Bill & Melinda Gates Foundation, the group tested ways to leverage AI toward equitable outcomes for students who are Black, Latino/e, and from low-income backgrounds. The cohort teams showcased the diverse ways education organizations are applying AI to tackle challenges and support the growth of students, educators, and communities, engaging in a wide range of activities: developing and studying AI tools and their implementation, exploring qualitative data analysis, and personalizing student-facing materials and instructional supports.

By highlighting early successes, key barriers, and sector-specific actions, the new report offers a look into the possibilities and challenges of implementing AI for teaching, learning, and social connection in K-12 education.

Collaboration and Co-Design

Designing in partnership with education leaders, teachers, and students is crucial for building AI tools that are relevant, ethical, and equitable.

K-12 AI Pilot Cohort participants came away agreeing that developers should prioritize human-centered design by actively involving educators, students, and communities in the development process. Working in this way can ensure that AI tools are not only innovative but also genuinely meet the needs of the users they are designed to support. Cohort teams found that intentional collaboration led to key understandings about contextual requirements for successful implementation, as well as the development of more useful, culturally responsive tools that address real needs.

For example, Teaching Lab’s pilot project leveraged co-design with educators to produce a variety of curriculum-aligned, customizable tools. Using their research and development platform, Teachinglab.ai, Teaching Lab tested math-based tools with dozens of educators. Based on feedback and contributions from educators, they developed two different products: a browser extension that teachers can use to customize math curricula to suit their specific students’ context and needs, and a product that embeds modified curricula directly into a Google Doc for teachers to use. Teaching Lab will continue to test both products in their math coaching partnerships this year, keeping the co-design going with an eye toward culturally responsive products.

Trust and AI Literacy

While AI offers promising ways to use data to support decision-making, educators must trust and understand the tools and programs to feel safe using them.

Many educators feel they lack even a baseline understanding of how AI tools work, let alone the trust that using them will be effective and equitable for their students. Cohort members were looking for more transparency, particularly around diversity in AI development. They wonder: who builds and trains the AI, what data is used, and who is represented?

For example, Amplify’s project used generative AI to support teachers in providing asset-based feedback on students’ math work, a research-based practice linked to improved outcomes for historically and systematically excluded students. The AI tool offered feedback on teachers’ feedback, guiding them toward more positive framing for students. While about half of participating teachers felt the tool improved their feedback, many wanted greater transparency about how it worked, particularly when they were receiving constant critiques of their own feedback. This underscores the critical need for transparency and trust between educators and edtech developers, especially in addressing diversity and equity concerns, to ensure AI tools are effective and supportive in educational settings.

Support and Professional Learning

Educators need comprehensive support and ongoing professional learning to effectively integrate AI tools in schools and classrooms.

Teachers can benefit from professional learning focused not only on how to use AI but also on understanding its role in enhancing, rather than replacing, their work. To build trust and address these concerns, districts can develop ethical AI use policies, provide clear communication about data privacy, costs, and access to AI tools, and educate staff on the benefits and risks of AI, especially regarding data security. Equitable AI literacy is also important to prevent an AI access gap in under-resourced schools. Many districts currently lack policies and struggle to create sustainable guidelines for evaluating AI, which raises concerns that some may avoid AI altogether. To ensure AI’s equitable and effective use, districts, schools, and educators require targeted support in navigating these challenges.

For example, KIPP Chicago is serving a higher percentage of newcomer students than ever before. In response to teachers’ concerns about how to better support this population, KIPP Chicago is working with industry partners to design GenAI-powered tools that offer educators improved ways of communicating with non-native English-speaking families and students. To launch this work, KIPP Chicago held AI literacy sessions where educators and community members met with AI industry partners to learn how the technology works and to address concerns about AI’s impact on their classrooms, careers, and the field of education as a whole. These sessions created an ongoing, open dialogue about effective integration of AI in school, which led to increased confidence for many participating educators. Several teachers felt confident enough in their expertise to present at The AIRSHOWS @ ASU+GSV on the future role of AI in education.

While the 28 projects ranged in scope and findings, participants agreed that teachers and students must be at the heart of AI development in education to ensure these tools meet their needs. This requires a shift in mindset — from seeing AI as a standalone solution to embedding it within the real-life contexts of classrooms and communities — that can only be achieved through ongoing collaboration between developers and educators. Read the full report, “An Ethical and Equitable Vision of AI in Education,” for a more in-depth exploration of how to design AI tools that are accessible, ethical, and beneficial for all learners.

Want to know more about AI in education and the work of the K-12 AI Pilot Cohort? Explore the first four blog posts in this series.
