Arts Perspectives on AI: Faculty discuss integrity and innovation

Developments in generative artificial intelligence (AI) have sparked widespread discussion about the opportunities, challenges, and ethical questions surrounding AI in higher education. While there is no blanket approach to its use, AI’s effect on teaching and learning calls for quick and creative responses.

Dr. Laurie McNeill and Dr. Patrick Parra Pennefather participated in the “Arts Intelligence on AI” panel discussion in August, along with other Arts faculty. The panel raised thought-provoking questions about acceptable uses of AI for students and teachers, while emphasizing that a critical relationship with these tools is essential.

Laurie and Patrick have integrated AI into a variety of teaching and research endeavours for many years. Here, they share what they have learned about using it in course design, and what they see as the biggest AI-related opportunities and challenges facing students and faculty.

This is part of an ongoing GenAI op-ed series, Arts Perspectives on AI (Artificial Intelligence), that features student and faculty voices from the UBC Faculty of Arts community.


Dr. Laurie McNeill
Professor, English Language and Literatures;
Associate Dean, Students

How have you approached teaching students about AI practices?

In my courses, I’ve approached AI primarily through an educational framework for the other “AI”: academic integrity. That is, I’ve been thinking about how we help students learn about AI tools, such as ChatGPT, and how to use them with integrity. As educators, we need to create the conditions for all students to understand the technology, how its use may or may not support the learning they are being asked to demonstrate in their courses, and what responsibilities they have as students and as users of these tools more broadly. Those responsibilities must include asking critical questions about who has built the AI, who will profit from that labour, who might be harmed by it, and what we are agreeing to when we sign up to use it. For many of us, that requires learning alongside or from our students, a real opportunity to collaborate with them on developing dynamic principles and practices about AI in the specific contexts of courses and disciplines.

What do you believe are the biggest AI-related opportunities or challenges facing faculty and students?

In my own discipline of English literatures, and specifically in my field of auto/biography studies, AI poses some fundamental challenges to our understanding of authorship and authenticity, in ways that will spill over into our courses and our teaching. For example, do we analyze texts differently when we know that the writer has used GenAI, which produces content by aggregating texts written by other people? Just as such tools challenge the fundamental expectation that students do their “own work” (and what does “own work” mean now?), they also invite literary scholars to show students why close engagement with works of literature, attending to the words on the page or screen, whether written by a human or an AI, is more urgent and productive than ever.


Dr. Patrick Parra Pennefather
Assistant Professor, Design & Production
Department of Theatre & Film

How have you approached teaching students about AI practices?

Generative AI is integrated into all aspects of my course design. It’s important for students to experience using AI in their academic life so they can develop informed opinions about its benefits, constraints, and challenges. Having a critical voice in relation to AI is important as students prepare for the industries they’re transitioning into: an increasing number of organizations are gradually integrating AI into their workflows and processes.

What can we learn from your research on AI practices and how you integrate it into your course? 

While the use of AI in academia has received much negative attention because of the misuse of LLMs like ChatGPT, I have been integrating AI into sound design classes since 2007, using generative audio, noise reduction software, and other virtual plug-ins to support the creation of sound. In my research, I am locating examples of instructors who are discerning in how they integrate generative AI into their learning environments but who also see its advantages.

It is important to use AI as a tool that supports the creative process, and to encourage students to be transparent about how they use it. This can be achieved with assessment rubrics that reflect the process as much as the final media deliverable. In other words: if you are going to use generative AI, tell me how you used it, share a screenshot, and explain how effective it was and how the technology contributed to your creative process. Integrating AI into learning environments can demonstrate how it can support, but not replace, our creativity.

What do you believe are the biggest AI-related opportunities or challenges facing faculty and students?

The biggest challenge in integrating AI is thinking that it’s a “silver bullet” that solves all teaching and learning problems. Conversely, the opportunities appear in the ways AI can inspire ideas. In addition, the biases inherent in AI-generated content bring our own biases to the surface when we create media. Understanding how AI systems work, how they are trained, how content is used and labelled, and how the algorithms function could also help clear up the misconceptions and worries people have about the technology. Once you know how these systems work, you become more concerned with how to interact with them appropriately, if at all.