B.C., Alberta, Quebec watchdogs order Clearview AI to stop using facial recognition tool
Company that gathers images from across internet says it's no different from Google
Three provincial privacy watchdogs have ordered facial recognition company Clearview AI to stop collecting, using and disclosing images of people without consent.
The privacy authorities of British Columbia, Alberta and Quebec are also requiring the U.S. firm to delete images and biometric data collected without permission from individuals.
The binding orders made public Tuesday follow a joint investigation by the three provincial authorities with the office of federal privacy commissioner Daniel Therrien.
The watchdogs found in February that Clearview AI's facial recognition technology resulted in mass surveillance of Canadians and violated federal and provincial laws governing personal information.
They said the New York-based company's gathering of billions of images of people from across the internet — to help police forces, financial institutions and other clients identify people — was a clear breach of Canadians' privacy rights.
The orders from the provincial authorities Tuesday also require Clearview AI to stop offering its facial recognition services in the three provinces. Clearview has not been providing services to clients in Canada since the summer of 2020, but has hinted it could return.
Therrien's office lacks the order-making powers of its provincial counterparts, a gap that has prompted calls over the years to update federal privacy legislation.
"We welcome these important actions taken by our provincial counterparts," Therrien said in a statement. "While Clearview stopped offering its services in Canada during the investigation, it had refused to cease the collection and use of Canadians' data or delete images already collected."
'Simply not possible'
The company told B.C. privacy commissioner Michael McEvoy in May it was "simply not possible" to identify whether individuals in photos were in Canada at the time the image was taken or whether they were Canadian citizens or residents.
In reply, McEvoy pointed to Clearview's intention, stated in a U.S. judicial proceeding, to limit the collection and use of personal information in the state of Illinois.
In his order Tuesday, McEvoy rejected the company's "bare assertion that it cannot comply" and concluded it does have the means and ability to severely limit, if not eliminate, the collection, use and disclosure of personal information of British Columbians.
"Put another way, this is not a question of cannot but rather will not."
Clearview says decision contrary to freedom of expression
Doug Mitchell, a lawyer for the company, said Clearview AI is a search engine that collects only public data, just as much larger companies do, including Google, which is permitted to operate in Canada.
Given that Clearview is not operating in Canada now, the company believes the orders are beyond the powers of the provincial privacy commissioners, as well as unnecessary, Mitchell said Tuesday.
"To restrict the free flow of publicly available information in the sense proposed by the privacy commissioners would be contrary to the Canadian constitutional guarantee of freedom of expression."
Clearview AI has left the Canadian market, but the problem created by its business model remains, the Canadian Civil Liberties Association said in applauding the provincial crackdown.
The company still holds, and uses, potentially millions of photos of people from Canada, which it continues to sell to policing bodies around the world, the civil liberties association said.
"This leaves potentially all Canadian residents who have ever posted photos online on a wide range of popular online platforms in a perpetual police lineup," the association added.
"We are profoundly concerned that the inconsistencies in privacy laws mean that millions of other people in other Canadian jurisdictions remain unprotected by this order."
The association says facial recognition not only amounts to a dangerous form of mass surveillance but is also fundamentally flawed, with inaccuracies that can effectively discriminate against people who are not white.