Commentary / Privacy, technology and surveillance: What to watch for

By Tim McSorley

It’s no secret that governments and corporations have been collecting, using and analyzing our data. The tools they employ have been rapidly increasing in number and complexity — from spyware to social media crawlers to algorithmic analysis.

Canadian laws, however, have struggled to keep pace with technology. The ability to detect and punish the unlawful collection and use of our personal information has fallen behind.

There is an urgent need for new legislation to counter unethical surveillance. Unfortunately, it’s not just private companies, but government bodies — including national security and intelligence agencies — who have taken advantage of gaps in legislation and privacy protections, and even lobbied for them to be entrenched in legislation.

This winter, privacy, technology and surveillance are finally being debated on Parliament Hill. Academics, who are at the forefront of important research on sometimes controversial subjects, may be particularly vulnerable to unwarranted censorship or sanction. Below are some key areas to watch in the coming months.

Facial recognition technology (FRT)

In 2020, news reports revealed US firm Clearview AI’s illegal practice of scraping social media to create databases of billions of facial images for use by law enforcement around the world. This included the RCMP and various police services across Canada.

An investigation by the Office of the Privacy Commissioner of Canada (OPC) found that Clearview AI had broken Canadian law and that the RCMP’s use of the tool raised serious concerns. However, the OPC also pointed out that there is only a patchwork of laws governing facial recognition in Canada; essentially, nothing would prevent another Clearview AI from setting up shop in Canada, or require law enforcement to limit its use of FRT in order to protect basic rights.

It’s imperative that the federal government consult with the public and introduce legislation to address this problem. The House of Commons Standing Committee on Access to Information, Privacy and Ethics also called for legislation following its months-long study of FRT, and even suggested a moratorium on police use of FRT until new rules were in place. The government is required to table a response to the committee’s report but has yet to do so.

Encryption

Strong encryption is essential to protecting our personal information online, including from fraud and theft. Safe internet banking, medical platforms, online messaging and video conferencing apps would not be possible without it.

For years, the Canadian government was a strong proponent of encryption. More recently, though, in line with its Five Eyes partners (the US, UK, Australia and New Zealand), Canada has started arguing that encryption must be weakened to give law enforcement and intelligence agencies access to our private communications.

Beyond the fact that there is no legitimate reason for government agencies to have that kind of access, this position ignores that any backdoor created to read our encrypted messages would be readily available to authoritarian regimes, and eventually to hackers. It is imperative that the federal government change course through both public statements and legislation.

Social media surveillance

As government agencies work to find ways to access our private, encrypted communications, they have already begun the mass surveillance and analysis of our public communications. Using tools like Babel X, the RCMP engages in social media surveillance to monitor for threats, arguing that because social media content is public information, collecting it does not constitute spying.

However, posting or sharing content on social media does not equate to consent for it to be used in other ways, and even publicly shared information can constitute personal data. Furthermore, multiple pieces of information, shared separately, can together paint a very detailed (and privacy-intrusive) picture.

In 2019, the government adopted new legislation that allows federal intelligence agencies like CSIS to collect “publicly available information” — a term not defined in the law. Privacy advocates have called for clarity, to no avail.

Facebook owner Meta is currently suing Voyager Labs, a social media surveillance firm, for creating 38,000 fake Facebook profiles to scrape data from some 600,000 users in order to identify individuals who pose a “security threat” — information it then licensed to US law enforcement agencies. The company’s argument? That its analysis is based on publicly available information.

While we have no evidence of Canadian law enforcement using such invasive tools, we know the RCMP has created fake Facebook profiles in the past to spy on Black Lives Matter activists in Toronto. Little would stop it from turning to this powerful form of social media surveillance as well, or even from lying to us about it.

Bill C-27

The government introduced Bill C-27, the Digital Charter Implementation Act, in 2022. It is a substantial, long-overdue bill that would heavily modify federal private-sector privacy laws.

Perhaps the strongest part of the bill is its proposal to finally grant the OPC order-making powers backed by monetary penalties (although these would be subject to approval by a problematic new tribunal). The bill fails, however, to firmly entrench privacy as a fundamental right, which would ensure that our personal information receives the protection it merits.

The bill also includes the Artificial Intelligence and Data Act (AIDA) to regulate the use of AI tools in the private sector — including, for example, the development of facial recognition and algorithmic surveillance tools. However, it leaves the bulk of the definitions and rules to be determined by regulation after the bill is adopted. The bill also contains a blanket exception for the use of any AI tools by national security bodies or any other department prescribed by regulation.

Opposition parties have taken note of the AIDA’s dangerous flaws and are threatening to kill that part of the bill. While regulation of AI is urgently needed in Canada — to limit surveillance, among other things — entrenching such a bad piece of legislation would be worse than no bill at all.

It will be important for the public and civil society groups to engage with government officials on these questions in the coming months. This includes academics, students and staff, who rely so keenly on academic freedom and on freedom of expression and association for their work, and who have been and remain staunch defenders of these fundamental rights.

For more information and ways to take action, visit iclmg.ca or contact the International Civil Liberties Monitoring Group at national.coordination@iclmg.ca.


Combining his passion for civil liberties and social justice with his background in journalism, policy analysis and communications, Tim McSorley, ICLMG national coordinator, digs into the impact of government policies and works with allies and partners to fight for change. He is a graduate of Concordia University in Montreal, with a degree in journalism and political science. CAUT is one of the founding member organizations of the ICLMG.
