Canada's laws need updating to protect against abuse from surveillance tech, watchdog says
Recent uses of facial recognition technology by companies and police violated Canadians’ right to privacy
Originally published on October 8, 2021
Canada's laws are "woefully out of date" on protecting people's privacy rights from the potential harms of technology like facial recognition and artificial intelligence, according to a deputy for the country's privacy watchdog.
"Our private sector law is now over 20 years old, so it was passed and conceived of before social media such as Facebook, [which] was only founded in 2004," Gregory Smolynec, deputy commissioner for policy and promotion in the Office of the Privacy Commissioner of Canada (OPC), told Spark host Nora Young.
Today, tools one might have once read about in a science fiction novel are being used in tandem by companies to better target potential customers, or by law enforcement with the stated intention of identifying criminals.
The OPC describes its role as working to "protect and promote the privacy rights of individuals" including reporting on how citizens' private information is handled by both the public and private sector.
Smolynec said lawmakers likely didn't anticipate how quickly tech like AI, or facial recognition technology (FRT) could evolve to a state where it could threaten individuals' privacy, whether through inadvertent or deliberate misuse.
Ethics of facial recognition
To some, FRT promises not only to be able to identify faces and match them to images, but to do things like detect emotion, or identify who might be truthful or good employees.
Luke Stark, assistant professor at Western University's Faculty of Information and Media Studies, isn't convinced it's that sophisticated — or that it's even a good idea.
"Facial recognition really just looks at the kind of patterns of lightness and darkness on a face. It's not doing much to identify you. It's just looking for patterns of facial ridges and other features," he said.
More worryingly, he argued, this purportedly mechanical categorization of facial features quickly becomes "the textbook definition of racism."
Past reports have found that FRT is worse at identifying the faces of people of colour, he said. Some have argued the problem is merely technical: include a wider range of people's faces and skin tones in the database, and the system will become more accurate.
Stark draws the opposite conclusion. "At its core, this is a fundamentally dangerous technology," he said.
Facial recognition controversies
The OPC has recently been involved in a handful of high-profile cases involving FRT.
Earlier this year, it found that several law enforcement agencies, including the RCMP, violated privacy laws by using FRT by Clearview AI, a controversial U.S.-based technology firm. According to the investigation, Clearview scraped more than three billion photos of people from the internet without their consent, then offered them to police services.
"I think Canadians … don't expect to be, so to speak, in a police lineup 24/7, in perpetuity. But this was kind of the effect of the collection of this database," said Smolynec.
At the time, the RCMP said "a few units in the RCMP" had been using the tech to "enhance criminal investigations," but didn't elaborate on exactly what it was using the technology for.
In the fall of 2020, the office found that Cadillac Fairview — the Canadian commercial real estate company that owns several malls in the country — collected biometric information with cameras installed in mall information kiosks.
Cadillac Fairview said the cameras' facial recognition technology was only used to gather anonymous customer demographics like gender or age. But the OPC said they also contained enough biometrics data to identify individuals based on their facial features.
New legal frameworks
Some of the OPC's projects aim to close that legislative gap with new recommendations.
In late 2020, it proposed regulations on the use of AI to promote "responsible" and "socially beneficial" uses of the technology.
"Artificial intelligence has immense promise, but it must be implemented in ways that respect privacy, equality and other human rights," Commissioner Daniel Therrien wrote in a statement.
In June, the office drafted early guidance for police agencies on the use of facial recognition. It advised that any use must be lawful, transparent and cannot overreach beyond a stated operation's objectives. According to Smolynec, this consultation is slated to end later in October.
Smolynec pointed to laws in provinces and abroad, noting that Canada is lagging behind somewhat on establishing federal guidelines on the topic.
"There is, for instance, in Quebec, specific legislation that governs collection of biometric information, but not elsewhere in the country," he said.
Prior to the 2021 election, the federal government had its first reading of Bill C-11, the Digital Charter Implementation Act. With a focus on giving Canadians more control over their online data, it was the first major attempt to reform online privacy laws in decades.
Privacy as a human right
Updating these laws is important, Smolynec said, because it can have implications even beyond one's privacy.
The OPC is planning to announce details of a new relationship with the Global Privacy Assembly — an international alliance of privacy commissioners — in the coming weeks, he said.
"What we're saying is, privacy is a human right. It's instrumental to other rights and freedoms," he said.
"At the very least, we could bring our laws up to date to take into account some of these developments and to ensure that we protect the human right of privacy, but not only for the purpose of protecting the human right of privacy, but for protecting other rights and freedoms to which privacy is instrumental."
Even that might not be enough for Stark, who compared FRT to plutonium.
"I think we need to treat it the same way we treat ... nuclear waste. I think we need to be taught to not produce any more of it and get rid of what we got," he said.
Written by Jonathan Ore with files from CBC News. Produced by Nora Young.
- A previous version of this story stated the OPC's investigation into Clearview AI found the program had scraped images of three million people. In fact, more than three billion photos of people were found to have been collected.