By Lissa Cowan
“It doesn’t make me nervous in terms of believing I’ll be replaced,” said Linda Meegan, head of the Professional English Language Development program at the British Columbia Institute of Technology (BCIT). “It does make me nervous in terms of its risks: plagiarism, loss of skills, loss of critical thinking, misinformation, overdependence on technology...”
Meegan is referring to ChatGPT, a type of artificial intelligence or AI — a branch of computer science focused on developing machines and computer programs that can learn and think like humans.
Released in November 2022, ChatGPT is an AI chatbot that follows a prompt entered by a user. Academic staff and students on campuses across the country are navigating the use of ChatGPT for activities like writing or editing, while similar tools such as OpenEssayist, Turnitin and Gradescope can be used to detect AI or help with grading.
“The technology can be used ethically, and it can also be misused and exploited. The ethical use of AI is a question that concerns us all,” said Sarah Elaine Eaton, associate professor of education at the University of Calgary. Her research focuses on academic ethics in higher education.
“There are strong indications from Microsoft and Google that by the end of 2025, AI technologies will be fully integrated into Microsoft Office and the Google Suite of products,” said Eaton. She added that educators and academics are unprepared for artificial intelligence programming that is embedded and fully integrated into our everyday technologies.
“Some will want to ban it, some will want to incorporate it into classroom activities or use it in the creation of teaching materials, some will allow it for student proofreading, and others — for example, those who teach English as an Additional Language — may not,” said Andrea Matthews, who teaches communications at BCIT. “Students will need to understand that what’s allowed by one instructor may not be permitted by another, and the onus will be on students to acknowledge when they have used AI in marked assignments.”
Managing student use of AI adds to the workload of academic staff, with potentially serious implications.
“For example, an academic staff complement wracked by precarity or inequity won’t be one in which enough faculty will have sufficient time, opportunities and freedom to engage with potential new tools or issues, whether productively or critically,” said Marc Schroeder, an associate professor of computer science at Mount Royal University.
Schroeder said that academic staff unions have been calling for more meaningful collegial governance at Canadian universities and have been working to foster the faculty voice within collegial processes. He argued that AI must be part of union actions to protect academic work and create a successful learning environment for students.
“These kinds of questions about academic policy, including who gets to make decisions about curricula and the choice of learning tools, learning activities and assessments, instructional modes, etc., just reinforce that faculty need to be participating fully in the institutional processes that shape the conditions of their academic work,” he said.
“This is a bit of a wakeup for the academy,” said Alec Couros, a professor of educational technology and media at the University of Regina. He added that he is constantly hearing from academic staff across Canada about possible cases of academic misconduct.
Since 2018, Eaton and her team have been analyzing more than 80 academic misconduct policies from across Canada as part of a national policy analysis of academic misconduct policies at publicly funded Canadian universities and colleges.
“I can say with confidence that all publicly funded Canadian post-secondary institutions have policies relating to student conduct and/or academic integrity. To the best of my knowledge, very few, if any, have clauses relating to artificial intelligence.”
She said that most institutions could use their existing policies to some degree to address misconduct involving the misuse of artificial intelligence tools. Eaton cites the Recommendations on the Ethical Use of Artificial Intelligence in Education, published in May 2023 by the European Network for Academic Integrity, as transferable to Canadian institutions.
In Canada, several universities have issued guidance and policies addressing the use of ChatGPT, including the University of Waterloo and UBC. At BCIT, Matthews said their Student Code of Academic Integrity won’t change substantially with the rise of generative AI tools, and that its statement that “Students must ensure that all academic work they produce is their own” captures unauthorized AI use. She refers to BCIT’s Generative AI Tools guide as covering some of these topics.
Detection technologies like Turnitin are used by a significant number of universities, colleges, and other public post-secondary institutions worldwide to check student assignments for instances of plagiarism. Another tool, GPTZero, created by Edward Tian, a 22-year-old computer science student from Toronto, detects whether a student has used ChatGPT. On the flip side, a student could use a paraphrasing tool like QuillBot to scrub AI-generated text from a paper before submitting.
“I’ve not been a supporter of these tools to detect when students are using ChatGPT and similar technologies,” Couros said, adding that a more appropriate question would be: “Why do we use these tools as an assessment technique that makes it easier for students to cheat?”
Eaton agrees with a more proactive approach to AI.
“Assuming that students are only using AI apps to cheat is reductionist, and this assertion is not supported by any current research.” She noted that students have been using apps powered by artificial intelligence for years — to check their grammar, for example. Eaton said that post-secondary institutions can and should be building and delivering undergraduate courses on how to use these technologies. She cited as an example the University of Calgary’s Haskayne School of Business, which is offering an undergraduate course this fall on generative AI and prompt writing.
Schroeder also sees the advent of AI tools as an opportunity for academic staff to ask themselves questions about learning theories. “What specifically is it about them [learning and assessment activities] that we think contributes to student learning? How do they reveal, or not, that learning is taking place?”
How does this additional analysis affect workload? Eaton says that there must be a way to ensure students are learning and have opportunities to demonstrate what they’ve learned without further adding to the workload of academic staff.
“I have had faculty asking me when AI apps will be available to help them grade student work and what AI apps can help reduce the burden of their workload,” she said. There are also implications for academic labour, such as the training educators will need to use AI apps in their respective disciplines.
Just as the pandemic prompted a reassessment of approaches to teaching and learning, AI is causing yet another disruption in higher education that academic staff unions must organize around. Shared decision-making structures and policies are required to bring together the collegium to navigate the challenges of AI. The uncharted territory of AI also requires funding commitments from the federal and provincial governments to tackle issues like wage disparity and the working conditions of academic staff.
“When the conditions of the academic job are healthy, we can better fulfil our roles in the interests of students and the communities we serve,” said Schroeder.