The role of university education in the age of AI



This is part of an ongoing GenAI op-ed series, Arts Perspectives on AI (Artificial Intelligence), which features student and faculty voices from the UBC Faculty of Arts community.


By Ritwik Bhattacharjee, Communications Specialist and PhD candidate in Interdisciplinary Studies

Artificial intelligence is raising urgent questions for educators, who are navigating how to balance AI’s impact on literacy and learning with the realities of a hyper-competitive academic environment.

While AI tools offer new possibilities, their growing use has also sparked concern about critical thinking, knowledge production, and the meaning of comprehension in higher education. For example, AI hallucination—that is, when a large language model (LLM) generates nonsensical or inaccurate output from nonexistent patterns or objects and presents it as fact—is playing havoc with the established knowledge bases that are essential for learning and development. A recent study has shown that even when AI provides factually accurate summaries and syntheses of a topic, users develop shallower knowledge than they would through a traditional human-led web search involving active discovery and synthesis from individual links and sources.


Saskia Tholen (she/her)
Alumna, Master of Arts in Political Science, UBC (’25)

We recently spoke with Saskia Tholen (she/her), UBC Political Science alumna (MA ’25), who explored these tensions in a widely read article for The Ubyssey. Tholen argues that the rise of AI has thrown the contested role of the university into sharp relief — particularly the tension between fostering human understanding and functioning as a training ground for the workforce.


What led you to write the op-ed piece?

The rapid development and uptake of generative AI-powered chatbots like ChatGPT, Claude, and Gemini over the past three years have made AI far more visible in everyday life and public discourse, and its use is near-ubiquitous among college and university students. There’s a palpable sense of anxiety about the prospective impacts on the nature of work and the labour market, especially among young people thinking about their futures. I feel it as strongly as any of my peers. When I began working as a Teaching Assistant at UBC, I saw firsthand just how dramatically AI is affecting higher education. This gave me a concrete foothold in a problem that had otherwise generally left me feeling powerless.

I decided to write this piece because I worry that we aren’t having the right kinds of conversations about AI in education. The discourse strikes me as depoliticizing and narrowly technical, divorced from deeper pedagogical, social, and philosophical questions. This is partially because we don’t appreciate the full extent of the disruption AI is going to cause, and because the deeper questions are hard to answer. But I think a reckoning is coming whether we like it or not. I also wanted to encourage fellow students to reflect on how their AI use affects their own hopes for their education.

“When I began working as a Teaching Assistant at UBC, I saw firsthand just how dramatically AI is affecting higher education.”
Alumna, Master of Arts in Political Science ('25)

What’s the key message you want people to take away from the op-ed piece?

The main idea is that AI brings to light fundamental questions about whom the university should serve and what its purpose should be—but that those questions didn’t come from nowhere. AI is drawing out an underlying identity crisis in higher education, which is this: is the university an insulated space for human self-development and the pursuit of understanding, or is it a practical training ground for the workforce? Is the value of education intrinsic or instrumental?

In my view, it’s likely that the arrival of generative AI will serve as a focusing event. It will bring unprecedented changes in the nature of teaching and learning, and in the economic pressures that shape the education sector. It has the potential to solidify one or the other competing vision for the future of the university, so it makes the deeper questions impossible to ignore. We should treat this moment not as a temporary shock that an otherwise well-functioning system needs to contain, but as a crucial chance to reimagine higher education.

“The arrival of generative AI ... has the potential to solidify one or the other competing vision for the future of the university, so it makes the deeper questions impossible to ignore.”
Alumna, Master of Arts in Political Science ('25)

As a TA working directly with students and evaluating their work, how do you think AI has impacted the quality of work, student interests and critical thinking skills?

My answer comes with a caveat: I didn’t teach before AI, so I can’t make a direct comparison. But my intuition, which every seasoned educator I’ve spoken to about this shares, is that we’re facing a precipitous decline in literacy and critical thinking skills.

Many, if not most, students use AI for some or all of the tasks an Arts degree involves: locating sources, reading, synthesizing information, generating ideas, outlining, writing, and editing. Obviously this is a concern from an academic honesty perspective, but even when students aren’t passing AI-generated assignments off as their own work, the potential effects are concerning. Slogging through the challenges involved in research and writing is how we develop crucial faculties like memory, attention, reflection, reasoning, and creativity. Over-reliance on AI jeopardizes this.

Of course, being able to delegate rote tasks is a gift for overworked students facing relentless pressure to achieve higher grades. Chatbots are powerful labour-saving devices, and that’s valuable. However, there’s a fine line between employing such devices strategically and relying on them completely because you’ve never developed the skills yourself. Increasingly, students seem to view the process of learning as a burden to be offloaded, and I think this contributes to a worrying tendency toward passivity and a lack of motivation.

I hear a lot of students talking about AI as a “tool” that they need to leverage if they’re going to get ahead in the world. I’m not sure they’re considering how AI might be reshaping the way they think—not to mention how much less interesting it might make their educational experience. Slogging through isn’t easy, but at least it allows for deep engagement with the content, and it certainly provides more stimulation than acting as manager to an AI—an administrative accessory to an artificial brain. AI saps some richness from students’ intellectual worlds in that sense.

“Increasingly, students seem to view the process of learning as a burden to be offloaded, and I think this contributes to a worrying tendency toward passivity and a lack of motivation.”
Alumna, Master of Arts in Political Science ('25)

What does democratizing education mean to you? Is AI posing a challenge or helping in this regard?

Democratizing education simply means dismantling barriers and increasing access; it means opening education up to more people and more kinds of people. I think AI’s potential here is complex. On the one hand, these tools can promote accessibility and equity by making schooling more personalized and adapted to suit students with diverse backgrounds and learning needs. The conventional format for Arts programs caters to a particular learning style, and it certainly helps if you’re neurotypical, English is your first language, and you were brought up in a Western school system. AI products offering assistance with notetaking, grammar, and so on can help reduce these barriers, which are arbitrary with respect to merit.

On the other hand, some people predict that the rise of generative AI could filter many people out of higher education, at least in the humanities. The idea is that precisely because AI can take over many of the basic functions of academic work, the university will become a place only for the students who are intrinsically motivated to learn—since they can no longer be made to learn if they don’t want to. AI might also catalyze this kind of narrowing in higher education by decimating job markets for certain kinds of graduates.

How do you think higher education can adapt to AI, going forward?

Universities are currently in reactive mode, and the response to AI has been quite piecemeal. We need to start responding proactively; we’ve seen that taking a “wait and see” approach to new digital technologies harms the young people who become the guinea pigs. And we certainly shouldn’t focus our energies on penalizing students for using AI. We also need a unified response, so that the burden isn’t on individual educators to figure out how to cope. I do believe there’s a place for generative AI in the university, especially if we can harness its capacity to make education more flexible and open without jeopardizing skills development and learning outcomes. Neither an uncritical embrace nor a wholesale rejection of this technology seems wise.

As I’ve argued, universities need to go beyond debating course policies and confront the identity crisis in higher education head-on. I’ve tried to suggest that some middle ground between the intrinsic-value and instrumental-value visions is possible. The university shouldn’t be an ivory tower sealed off from the world of practicality, but neither should it be a dehumanizing factory where only grades and credentials—the products, not the learning process—matter. The AI disruption offers an opportunity for us to work out exactly what this future looks like.

“The university shouldn’t be an ivory tower sealed off from the world of practicality, but neither should it be a dehumanizing factory where only grades and credentials—the products, not the learning process—matter.”
Alumna, Master of Arts in Political Science ('25)

Saskia is grateful to live and work in Tkaronto, on the traditional territories of the Mississaugas of the Credit, the Anishnaabe, the Chippewa, the Haudenosaunee, and the Wendat peoples. These lands are governed under the Dish With One Spoon wampum agreement, and more recently under Treaty 13.


About the featured image

The featured image was generated with the assistance of ImageFX, an AI-powered image generator.

The prompt used was: “Create an illustration of a sandcastle in the shape of a university building. A strong wind is blowing, pulling the sand away from half of the building to create a thin cloud of dust around it. the building should be centred on a solid purple background.”


About the author

Ritwik Bhattacharjee is a PhD candidate in the Interdisciplinary Studies Graduate Program and a Communications Specialist for the Faculty of Arts. His interdisciplinary project works towards dismantling settler colonialism by investigating the role of socially unconscious background assumptions that are resistant to transformative change.

