How Higher Ed Can Make AI Work – Digital Promise



May 7, 2026

  • Rather than merely providing access to AI tools, institutions must build systems that translate access into meaningful learning.
  • Successful AI integration depends on strong governance, clear guidance, and aligned professional learning, not just individual innovation.
  • Moving from isolated efforts to equitable, scalable AI implementation requires coordinated, cross-disciplinary institutional design.
Across higher education, there is no shortage of motivation to integrate AI. Institutions are investing in tools, encouraging experimentation, and signaling that innovation matters. But as “Institutional Conditions for Integrating Emerging Technologies in Higher Education” makes clear, good intentions alone are not enough. What determines whether AI becomes transformative or remains scattered across isolated classrooms is not the technology itself. It is the infrastructure around it.

Too often, institutions are expanding access without building the systems needed to support meaningful, equitable, and sustainable use. The result is a growing gap between what institutions hope to achieve and what is actually happening in practice.

Grounded in Real Institutional Experiences

We conducted focus groups and interviews with faculty, administrators, IT leaders, and students, alongside a national survey of 120 respondents from institutions of higher education across the U.S. This work is rooted in the lived experiences of the people doing the work. Rather than proposing another framework, our recent report examines how institutional conditions shape what actually happens on the ground. We present five findings from the study, each with a corresponding recommendation, to surface where systems are breaking down and where they can be strengthened to support coordinated, scalable AI integration.

Finding 1: Innovation is happening, but it depends on individuals

Across institutions, AI adoption is largely driven by individual faculty initiative rather than institutional strategy. Faculty described having the autonomy to experiment, but this often leads to a haphazard approach across campus. Less than 30% of respondents agreed that technology decisions and policies are clearly communicated across their institution.

  • What this means: Innovation is fragile. It depends on who is motivated, who has time, and who is willing to take risks.
  • Recommendation: Establish permanent, inclusive governance structures that include faculty, IT, and instructional support. These groups should have real authority and clear, documented feedback loops so decisions reflect classroom realities.
Finding 2: Access has expanded faster than guidance

Institutions have moved quickly to provide AI tools. Over 70% of survey respondents reported that students have access to paid AI tools. But policies, learning goals, and instructional guidance have not kept pace. Faculty and students are navigating mixed messages. Without shared institutional direction, one instructor may encourage using AI for brainstorming while another prohibits it entirely, leaving students confused about expectations.

  • What this means: Access without guidance leads to confusion, inconsistent student experiences, and no shared vision of what students should actually learn about AI.
  • Recommendation: Develop institution-wide guidance for AI use and define shared learning outcomes across disciplines. Students should not have to guess what responsible AI use looks like.
Finding 3: Professional learning exists, but it doesn’t fit faculty realities

Professional development on AI is widely available, but it is often misaligned with faculty time, workload, and incentives. Alarmingly, 60% of faculty reported they are not given compensated time to integrate new technologies. Workshops are often too long, resulting in “cognitive overload,” or are disconnected from classroom needs. Shorter, applied sessions work better.

  • What this means: Even strong professional learning opportunities go unused if they don’t reflect the reality of faculty work.
  • Recommendation: Redesign professional learning to be short, applied, and aligned with faculty schedules. Provide incentives such as stipends, recognition, or course release time. Embed instructional support staff early in planning, not just implementation.
Finding 4: Workforce preparation is uneven across disciplines

Nearly all institutions recognize the importance of preparing students for an AI-driven workforce. But implementation is uneven. AI learning opportunities are often concentrated in STEM programs, while students in other fields have limited exposure. As one administrator noted, “STEM students are getting all the love… but no one is really looking at scale at the non-STEM students.” Meanwhile, about half of faculty reported lacking resources to understand how AI connects to workforce applications.

  • What this means: Students’ AI readiness often depends on their major, not on institutional design.
  • Recommendation: Coordinate AI efforts across disciplines. Establish shared goals and create opportunities for all students to engage with AI through coursework, interdisciplinary projects, internships, and real-world applications.
Finding 5: Equity is a priority, but capacity is the barrier

Infrastructure, funding, and staffing limitations make equitable implementation difficult. Institutions are piloting promising initiatives, often with grant funding, but many struggle to scale them without sustained resources once the funding ends. Faculty even reported abandoning useful tools if they weren’t centrally provided, knowing that only students who could afford private subscriptions would benefit.

  • What this means: Equity remains an aspiration without the systems to support it.
  • Recommendation: Pair access to AI tools with targeted investments in infrastructure, staffing, and support. Equity requires resourcing, not just intention.

Moving Forward: Build the System, Not Just the Tools

Across all five areas, a clear pattern emerges. Institutions are moving quickly, but not always coherently. AI integration is not just a technology challenge. It is a systems challenge. When governance includes the right voices, when guidance is clear, when professional learning fits faculty work, when efforts are coordinated across disciplines, and when equity is resourced, AI can move from isolated innovation to institutional transformation. Higher education is at a critical moment. The question is no longer whether to adopt AI. It is whether institutions will build the systems needed to do so thoughtfully, equitably, and at scale.
