As the workshop facilitator demonstrated the latest update in AI technology, we realized that we had lost the arms race of teaching and learning.
This particular platform, part of a family of Autonomous AI (AAI) toolkits, functions as a web browser: users open their learning management system and instruct the browser to “read” all posted course materials and complete all assessments. There were options to “slow down” so that the activity metadata looked more human; you could even instruct it to make a couple of errors. One of our colleagues tried it in their third-year course: the AAI browser earned a 72%.
You can watch the session on YouTube: https://youtu.be/3msiurIRBE8.
The video demonstrates AAI acting upon a course assessment — hosted on a learning management system (D2L Brightspace, in this case) — that was explicitly designed to evaluate higher-order skills while providing equitable access (i.e., open internet, no lockdown browser, more than 2x time). The question presents students with a dataset and asks them to synthesize and interpret novel scenarios that reflect concepts covered in lectures.
Losing the arms race isn’t about all students or all teachers missing out. In many respects, that would be the better outcome, because it would spark a drive for change. No, losing the arms race will mean the inequitable exclusion of students who rely on technology to access their courses, and of teachers who were encouraged to upload, adapt, and rethink their resources for platforms that are now becoming increasingly useless.
Audio/video-recorded lectures can be captioned more easily than live lectures, giving better access to students who are Deaf or hard of hearing. Providing lecture slides ahead of time allows English-language learners to preview the content and tackle any new vocabulary before the lecture. It also allows students with learning disabilities and neurodivergent students to follow along better in class.
Online content often becomes the only resource for students with mobility challenges when attending lectures is impossible, whether temporarily because of uncleared snow on sidewalks or permanently because of still-inaccessible buildings and classrooms. Synchronous online lectures that are then recorded increase the number of students who can attend class. Similarly, online final exams, especially for distance-education courses, make education accessible to students in remote communities.
Removing these online resources as a strategy to address the challenges posed by AI would be akin to taking all cars off the road to address the safety risks posed by self-driving cars.
Higher education’s history of fighting back leads us to predict that we will not respond well to this latest technology update. In 2020, for example, institutions were quick to adopt lockdown browsers and digital monitoring systems that disproportionately flagged Black and brown-skinned users. Upon learning this, some of us started a “desk lamp borrowing program” to help students’ faces appear lighter to the software.
Academic judiciary committees are still hearing cases of academic misconduct related to Proctorio or Respondus. Most higher education institutions didn’t stop using them after we learned that they were racist. Instead, we developed policies that place the onus on the student to request an alternative testing format that, at the instructor's discretion, could be less accessible.
New policies and resources are required to stem this tide. The onus for creating such policies and providing these resources must not fall on frontline instructors. They must be institution-wide and developed through a collegial process that includes broad representation from students, faculty, staff and administrative leadership.
In the same week as the workshop, the administration sent a memo about AAI and what instructors should do. For now, the recommended actions fall flat. Instructors should: (1) be aware; (2) consider assessment approaches less vulnerable to automation; (3) remember that lockdown browsers such as Respondus are still available; and (4) remind students about the institution’s academic misconduct policies.
An institution without a policy regarding student use of AAI is wasting instructor expertise and time in the design and implementation of inclusive evaluations. It is shortchanging students who appear to have achieved the learning outcomes without actually engaging in the learning. It is breaking a social contract with future employers and community members who understand (understood) what it means (meant) to be a graduate of that institution.
Technology and the internet have increased accessibility to higher education across Canada, and our society is better for this. We must not respond to autonomous AI technologies by regressing to past ways or adopting punitive policies that fragment student experiences and access. Autonomous AI challenges us to reflect deeply on what skills and attributes we want our learners to have when they graduate. After all, anyone — human or not — can parrot information; what matters more is how we want students to apply it.
Yet the creative work our sector needs to engage with is inconsistent with provincial governments' approaches to resourcing education. If funding continues to be slashed, we will choose to regress to old ways. Old ways that will not support the bulging enrolments that we have sustained to increase revenues. Ways that will mean denying a university education to already marginalized disabled students, whose right to education is encoded in law.
Today, we find ourselves on a path forward with opportunities to creatively respond to changing societal needs. Let us not turn back to where we once were, with resourcing unable to support all students. Let us dig into a period of disruption with the passion and optimism needed to drive sustainable, meaningful change.
Dr. Shoshanah Jacobs, Dr. Alex Smith, and Dr. Daniel Gillis are professors. Chris McCullough and Amanda Ball are graduate students. They work collaboratively in the Department of Integrative Biology and the School of Computer Science (Gillis) at the University of Guelph.