When AI is mentioned, most post-secondary educators will think first of generative AI (GenAI) technologies. Over the past two years, student use of tools like ChatGPT and Copilot has come to dominate many of our conversations. As my techno-optimist colleagues welcome AI into the classroom, many others have resigned themselves to preparing students for a world where AI is inevitable.

Those of us who study the social and economic impacts of AI systems still resist, citing research about AI bias, false information and “hallucinations”; the environmental costs of these technologies; and recent findings of cognitive decline among generative-AI users.

As the opportunities, risks, and impacts that accompany this new wave of digital technologies are fiercely debated, there is at least one thing that proponents and critics can agree on: AI is impacting post-secondary institutions, often in disruptive ways.

Irrespective of our orientations, AI has rendered many of our traditional learning assessments futile. More tiresome than reading a term paper written by an AI chatbot is having to read ten of them, and the only thing more exhausting than that is the paperwork required to report a violation of the institution's academic integrity policy.

Post-secondary institutions haven’t yet reckoned with the additional workload associated with unauthorized AI use, and they’ve failed to consider the impact that policing AI use might have on the teaching evaluations students give their instructors. Or perhaps they don’t care. After all, many schools are giving students discounted or free access to these tools.

While generative AI has captured the imagination of many (and colonized the nightmares of others), it is only the most recent trend in a much larger digital turn affecting all aspects of our lives. This digitalization is fueled by the large-scale collection of data about our personal and professional activities, which is then algorithmically processed for a vast range of uses. Algorithms curate our online social media experiences; price our concert tickets, airplane flights and, increasingly, our online groceries; and, within workplaces, they are being used for everything from evaluating job applicants to monitoring computer networks, tracking employee locations, assessing worker performance and more.

As educators, we seem prepared to consider what impact these technologies will have on our crafts of teaching and research, but rarely do we consider how these trends affect us as workers and public sector employees. What data are our educational institutions collecting about us? How are they using it? And how can we ensure that our rights and interests are protected in the age of algorithmic management?

Why data rights are a union issue 

As part of my research, I conduct interviews about how people feel about their privacy and data rights. From warehouses to hospitals, workers sometimes tell me that they don’t mind being tracked because they have nothing to hide. This position, however, misses the point and underestimates the risks.

Scholars have made compelling arguments for the value of preserving personal privacy at the individual level. Yet there are also crucial reasons to treat data privacy and data rights as a collective issue. Simply put, the power of workers’ data isn’t just in the performance or activities of a single person; it is in the patterns and trends that can be derived by processing the data of all of us.

Let me give you some examples: Want to know who is a productive scholar? You need to know the average number of publications per year and the associated journal rankings. Want to know who spends the most time in the lab? You need to know how often people are present. Want to know if the ethics office needs an additional hire to process applications within a predetermined time frame? You need to know the average rate of work.

At York University, where I teach, the administration created a campus-wide task force on AI. This task force recently introduced a new module on how to use AI for course planning and is using AI to “automate complex workflows” to “free up staff to tackle more impactful work.” This adoption, however, is taking place amid the chronic underfunding of higher education.

Institutions like mine aren’t hiring to fill the vacancies left by retirees; those jobs are being replaced by algorithmic robots. Need help from a real human? You can expect to work hard to find one, and then you can expect to wait — because humans are a disappearing species in this age of austerity, and those who remain are stretched thin. It is a work quality issue for faculty and staff alike.

Another reason data rights are a union issue is to ensure that our rights aren’t eroded in the years to come. Unlike other resources, the data collected about us today may be used in the future in ways we have not yet imagined. Metal keys have been replaced with digital key cards. No longer do these tools merely grant access to the building; they now provide insight into how often we go to the office, which parts of campus we visit, and how often we access library resources.

These data could provide ample justifications for an administration to propose eliminating offices, reducing parking, or cutting back on library subscriptions. When you consider the monitoring of email and data storage, printing devices, work order requests, CCTV cameras, access cards, location devices, chat functions and software tools, it becomes clear that there is no shortage of data, and the array of potential uses is endless.

The need to bargain for better data rights 

Management rights are typically far-reaching and constrained only by law and the collective agreements we negotiate. Given that Canada lacks a robust personal data protection regime, labour unions across the country can help to close this gap by bargaining for better data protection rights for their members.

Unions should aim to secure rights related to all parts of the data lifecycle, from collection to disposal. This includes:

  1. How and under what circumstances data about workers is collected
  2. How workers’ data is processed, and what restrictions on processing are in place
  3. How data is stored and secured
  4. How and when data is deleted or otherwise disposed of

Most importantly, unions should negotiate for ongoing consultation and co-governance rights about how digital technologies are selected and implemented at work.

Public resources for the public good 

Recent polls show that despite the vast investments by many organizations into digital technologies and AI “solutions,” most organizations haven’t realized the efficiencies that these tools promise to deliver. If you watch the markets, you know this hasn’t slowed investments.

Instead, much of this trend within our educational institutions could be aptly described as taking public money and public data — about workers and our students — and pouring it into the coffers of private corporations that are based, overwhelmingly, in the United States and that are busily developing the next algorithmic innovation to which we will be subjected.

It is time we had some guardrails. As unions move toward the bargaining table, let us all remember that an alternative is possible: one where workers have rights over and access to the data they produce, where public money is spent in the public’s interest, and where public post-secondary institutions are funded adequately because they are a public good.

Dr. Hannah Johnston is an assistant professor at York University in the School of Human Resources Management, where her research focuses on the digitalization of work.