Last winter, months before OpenAI’s announcement of its new generative AI platform ChatGPT Edu, a handful of universities were previewing the possibilities.

At Arizona State University, a German professor started building a bot that evaluates students’ language proficiency and offers personalized feedback. At the University of Pennsylvania’s Wharton School, MBA students reported thinking more deeply about the material after “discussing” their final assignment with the platform.

And at Columbia University, researchers fed ChatGPT Edu massive datasets on overdose deaths. Within seconds, it surfaced patterns that could help public health officials identify high-risk users, tailor interventions, and get overdose-reversal drugs to the people who need them.

The Big Idea: Eighteen months after OpenAI launched the revolutionary ChatGPT, this new tool is the company’s attempt to gain a foothold in the lucrative and still mostly untapped higher education market. But ChatGPT Edu also has the potential to pull the slow-moving world of academia—and hundreds of thousands of academics who have avoided generative AI out of fatigue, fear, or a whole host of other reasons—into the era of artificial intelligence.

For the early adopters, it’s a chance to shape how higher education applies generative AI to teaching and research.

“When you’re one of a handful of institutions piloting a program like this, you’re establishing your own ground rules,” says Kyle Bowen, ASU’s deputy chief information officer. “That gives you more power than if you wait until the technology is already in a mid-space where it’s not new. The fact that generative AI is advancing and evolving so quickly is why it’s important to be an active participant in defining the future for it.”  

Leaders and Laggards


Higher education on the whole has been “lethargic and slow” to experiment with ways to leverage this culture-changing technology, says George Siemens, an expert on human and artificial cognition and chief scientist at Southern New Hampshire University’s new Human Systems project, which is investigating the systemic impact of AI on learning and wellness.

According to a recent survey by Tyton Partners, only 36% of faculty and 40% of administrators use generative AI at least once a month, compared to 59% of students. Meanwhile, companies are expanding their use rapidly: 65% of respondents to a McKinsey Global Survey said their organizations regularly use generative AI, nearly double the percentage 10 months earlier.

That comes as a separate McKinsey study projects that by 2030, 30% of hours worked in the U.S. could be automated, with a big boost from generative AI. Past such projections have proven unreliable in their specifics, but few experts question that change is underway.

Higher education leaders need to be driving the conversation about what the application of AI can and should look like for education, Siemens says.

“What we’re having instead is an abdication of responsibility,” he said during an episode of The Cusp, a Work Shift podcast exploring how AI is impacting the intersection of higher education and the workforce. “It’s almost like negligence to not sit down and have that conversation. It’s defaulting to, ‘Well, we’ll just let things ride rather than respond to it.’”

He likens it to the advent of MOOCs. Many of the most outsized ambitions fizzled, but the platforms changed how higher education is marketed and delivered to millions of people. And institutions like the University of Michigan and MIT that dove in from the beginning both developed valuable expertise and helped cultivate a new market. They didn’t have to share huge chunks of revenue with the online program managers (OPMs) that later cornered much of that market.

Michigan, this time in partnership with Microsoft, is yet again among the early adopters. Dozens of community colleges also have begun incorporating various forms of AI into their teaching and learning, with a boost from Big Tech. But the cohort that piloted ChatGPT Edu is certainly among the most advanced institutions in using and shaping generative AI.

Advanced Capabilities, Tighter Security


The Details: OpenAI decided to formally launch a version of its platform for higher education after successfully piloting its newest model, GPT-4o, at ASU, the Wharton School, Columbia, the University of Texas at Austin, and the University of Oxford. ASU began the ChatGPT Edu pilot in January, with the other institutions joining later in the semester.

Though OpenAI announced the tool in May, it won’t be available to institutions more broadly until later this summer, the company says. OpenAI plans to charge institutions, not their students, to use the platform.

At the University of Texas at Austin, ChatGPT is only one of the generative AI platforms available to students and faculty. Art Markman, vice provost for academic affairs, said the university joined a pilot with the digital writing companion Grammarly, which built a large language model that can help faculty design better lesson plans, particularly in English composition courses. 

To drive sales of its platform, OpenAI is touting ChatGPT Edu’s custom features, which are unavailable on the public version. ChatGPT Edu is built for advanced text interpretation, coding, data analytics, document summarization, mathematics, and web browsing. It also is customizable, which means colleges can use the platform to build their own GPTs and share them within their workspaces. Importantly for security-focused colleges, none of the data entered into or produced by ChatGPT Edu will be used to train OpenAI’s models.

Bringing Faculty In: One of the challenges, Markman says, is catering to faculty whose tech skills run the gamut.

“We’ve got everybody from folks who want to be the first to use every technology to folks interested in the technology but need some handholding to folks who are just Luddites,” Markman says. “We’re willing to accommodate all of them.” 

At ASU, Bowen says, administrators are focusing on the culture of innovation around AI while they teach faculty, staff, and students about its benefits and challenges. 

“What’s important at this particular stage is developing broader understanding across the institution, but then moving it ahead and finding out where are the places where you think there is a difference maker with this technology,” he says.

Beyond Gen AI-Enabled Cheating


One reason academia has been slow to embrace generative AI is the initial and still widespread belief that it’s primarily a high-tech mechanism for cheating. Many faculty reject the notion that tools like ChatGPT will free students to engage in higher-level thinking, whether by performing mundane tasks or by injecting new ways of thinking into their work.

John Warner, an author and former writing instructor, wrote in Inside Higher Ed that “the way these things are being used is not as co-pilot assistants, but as outsourcing agents, subcontractors to avoid doing the work itself.”  

But some of the ChatGPT Edu pilot institutions are challenging that notion. Amber Hedquist, a doctoral candidate in ASU’s Department of English with a concentration in writing, rhetorics, and literacies, was one of 13 students using the enterprise version of ChatGPT in a course called Writing for Scholarly Publication. Hedquist and her classmates asked the platform for feedback on their work, which she said felt like a form of peer review.

“I think I’ll always prefer talking to a colleague or a faculty member who understands my field as a whole to give me suggestions,” she says. “However, the GPT was really helpful in those instances where maybe I couldn’t wait a week for feedback.”

Hedquist can see herself using the platform when teaching undergraduate composition courses, so students can make sure their work matches the class rubric, visualize their writing differently, and get suggestions for rethinking their word choices.

Parting Thought: “We need to view the composition classroom as a place to get students starting to work with [AI], but then also teaching them how to work with it and engage with it critically rather than work with it as an input-output system,” says Hedquist, who specializes in technical and professional communication. 

“I think not incorporating AI into the writing classroom is a disservice to students at this point.”