Colleges face a choice: Try to shape AI’s impact on learning, or be redefined by it
6:04 AM on Monday, February 23
By Vicki Baker and Linda M. Boland
(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)
Vicki Baker, Albion College and Linda M. Boland, University of Richmond
(THE CONVERSATION) What happens to a college education when a chatbot can draft an essay, summarize a reading and generate computer code in seconds? The arrival of artificial intelligence in college classrooms has been swift and, for many schools, disorienting.
As professors of economics and business management and of biology at liberal arts colleges, we are confronting a question that now cuts across all colleges and universities: What is the purpose of a college education when AI is rapidly reshaping how students think, learn and prepare for careers?
While much of the public debate has focused on plagiarism and credit for student work, the deeper issue extends beyond rule-setting.
Across higher education, most schools have issued guidance on how students should use AI rather than adopting sweeping mandates.
Liberal arts colleges such as the University of Richmond, Bard College and Trinity College tend to emphasize that students use AI ethically and responsibly, and they typically allow students to use AI when they cite it and their instructor permits it. These schools also let professors set their own AI policies course by course.
A 2024 study of 116 research universities found similar patterns, with instructors largely determining course policies and few campus-wide bans.
What’s unsettled is not whether students can use AI, but how institutions want students to use it. In our view, unless colleges clearly shape AI’s role in teaching and learning, fast-moving technologies may begin to redefine education by default. The risk isn’t more AI, but a gradual shift in what counts as learning.
Students may spend less time asking hard questions, making their own judgments and building real expertise. In that case, college risks becoming less about understanding and more about producing papers and other content quickly.
Letting AI into the classroom
When generative AI tools first became widely available in late 2022 and early 2023, most professors focused on finding and preventing it in student work. They looked for signs of AI use, including generic phrasing, fake citations, sudden shifts in tone or unusually polished writing that didn’t match a student’s prior work. Some faculty also used AI-detection software to identify computer-generated text.
But it is often difficult to tell when someone has used AI, in part because the detection software is unreliable. As a result, many faculty have shifted from bans to more structured guidance, allowing students to use AI for specific tasks such as brainstorming, outlining or debugging code.
The rationale is practical: AI is everywhere and already embedded in professional settings. College graduates are likely to use AI in the workplace.
Accepting AI is here to stay
More recently, faculty at a range of colleges have shifted the focus from whether students are using AI at all to whether students who use it can still analyze, question and justify their own research and conclusions.
At the University of Michigan, for example, some faculty are redesigning assessments to include live debates and oral presentations.
And across the U.S., professors are reviving oral exams, since live questioning makes it harder for students to rely solely on AI. Students must then verbally explain their reasoning and defend their work.
Different academic fields, though, are approaching AI in various ways.
Many business programs, like the University of Pennsylvania’s Wharton School, have moved quickly to bring AI into coursework and degree programs, often framing these efforts as workforce preparation.
A recent analysis of more than 31,000 syllabuses at a large research university in Texas showed that a growing number of faculty allowed students to use AI in the fall of 2025. Business courses allowed the greatest use of AI, while humanities courses allowed it the least. The physical and life sciences fell in between.
Across disciplines, AI was most often allowed at this school for editing, study support and coding. It was most commonly restricted for drafting, revising and reasoning or problem-solving.
AI’s role in higher education is not settled. It is still evolving, shaped by the cultures of different academic fields.
Different schools, different approaches
Colleges and universities also vary in their overall responses and approaches to AI.
Research universities like Carnegie Mellon University and Stanford University are expanding on their long-standing investments in AI, moving quickly to develop new research centers, hire faculty with AI expertise and create new degree or certificate programs.
Liberal arts colleges are moving too, but often with a different emphasis.
The Davis Institute for AI at Colby College supports AI work across disciplines through new courses, faculty development and entrepreneurship. At the University of Richmond, a new center links AI to critical thinking and human values, so students can study AI’s impacts and help shape it intentionally.
All of these schools are determining AI policy course by course, but these plans are not yet part of comprehensive, school-wide strategies.
Few schools have articulated coordinated, institution-wide plans on AI. Arizona State University is one example of a broader AI integration strategy, which spans academics and campus operations.
Comprehensive AI strategies are expensive. Meaningful integration may require campus licenses for AI services, upgraded computing systems and faculty training. These investments are difficult at a time when many colleges face enrollment declines and financial strain.
Public trust in higher education is another concern that makes enacting broad change difficult. Gallup surveys in 2023 and 2024 found that only 36% of Americans had high confidence in colleges and universities.
Against this backdrop, AI is raising questions about how colleges prepare students for their careers. Employers still prize critical thinking and communication. Yet generative AI can mimic the appearance of thinking even when real understanding is absent.
The tension is clear: If AI does the writing, coding or analysis, where do students do the thinking?
Rethinking learning
Rising use of AI is forcing colleges and universities to revisit what students should learn, how to measure that learning and the enduring value of a college degree.
That shift moves the conversation beyond course-by-course changes to a shared strategy on what forms of knowledge and thinking are developed in college. Colleges may redesign assignments, expand oral and project-based assessments, and integrate AI literacy across disciplines. They may also clarify learning outcomes, invest in faculty development and find new ways to document students’ judgment and problem-solving in an AI-assisted world.
The question is no longer whether AI belongs in higher education. The real question is whether colleges and universities will shape its role – or allow AI to quietly reshape them.
This article is republished from The Conversation under a Creative Commons license. Read the original article here: https://theconversation.com/colleges-face-a-choice-try-to-shape-ais-impact-on-learning-or-be-redefined-by-it-275653.