How activists are fighting back against facial recognition

Academics and activists are calling out the dangers of facial recognition technology and advocating for a privacy-first approach.

‘People have no agency, no power to fight back’

Shankar Narayan, legislative director of the ACLU of Washington, left, speaks at a news conference outside Amazon headquarters, Monday, June 18, 2018, in Seattle. Representatives of community-based organizations urged Amazon to stop selling its face surveillance system, Rekognition, to the government. They later delivered the petitions to Amazon. (Elaine Thompson/The Associated Press)

Smile, you're on camera. And it may know who you are, without you saying a word.

For its coverage of Prince Harry and Meghan Markle's royal wedding, Sky News in the U.K. knew wedding watchers wanted the names of the famous and somewhat-famous guests walking into St. George's Chapel in Windsor.

They turned to facial recognition technology. As guests sauntered in, the software automatically pegged who they were and popped their names up on the screen.

Some academics and activists have misgivings about the software. More than 70 civil and human rights organizations in the U.S. have penned an open letter to Amazon about the company's facial recognition service. They want the technology giant to cease the sale of its Rekognition software to police departments and government agencies.

In part, the letter read: "Face recognition technology represents an unprecedented threat to privacy and civil liberties that even the best traditional forms of regulation will almost certainly fail to mitigate."

Meanwhile, in Canada, researchers and advocates are also fighting the rise of these systems.

Ann Cavoukian is the former privacy commissioner of Ontario. She now leads the Privacy by Design Centre of Excellence at Ryerson University. Cavoukian worries about the misuse of the technology.

It's already happened. A few years ago, a face recognition app called FindFace was used by trolls to out and stalk Russian porn actresses.

"We see with facial recognition, a lot of the time you don't even know it's taking place," said Cavoukian. "And the problem is once your facial image is captured, it can be used for a variety of different purposes—identity theft, etc. But also it can place you at a particular place at a particular time and it can enable tracking of your activities. That's nobody's business."

How facial recognition technology works

The software detects your face in an image and then assesses your features, such as the width of your nose or the length of your jawline. It then creates a code called a faceprint, much like a fingerprint. That faceprint is matched against a set of identified photos and faceprints to name the person in the image.
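The matching step can be sketched with a toy example. This is a simplified illustration, not any vendor's actual system: real faceprints are high-dimensional embeddings produced by neural networks, and the four-number vectors, names, and threshold below are invented assumptions.

```python
import numpy as np

# Hypothetical 4-dimensional "faceprints" for two enrolled people.
# (Real systems use embeddings with 100+ dimensions.)
gallery = {
    "guest_a": np.array([0.1, 0.9, 0.3, 0.5]),
    "guest_b": np.array([0.8, 0.2, 0.7, 0.1]),
}

def identify(faceprint, gallery, threshold=0.5):
    """Return the name of the closest enrolled faceprint,
    or None if nothing is close enough to count as a match."""
    best_name, best_dist = None, float("inf")
    for name, enrolled in gallery.items():
        dist = np.linalg.norm(faceprint - enrolled)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

probe = np.array([0.15, 0.85, 0.35, 0.5])  # a new face, near guest_a
print(identify(probe, gallery))  # → guest_a
```

The threshold matters: set it too loose and strangers are misidentified; too tight and real matches are missed. That trade-off is at the heart of many of the accuracy concerns critics raise.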

Avishek Bose is a researcher and graduate student in engineering at the University of Toronto. He's working on software that fools detection technology by slightly distorting online images.

"What it does is that it adds an Instagram-like filter on top of your face image, such that if humans were to look at it they wouldn't be able to tell the difference between the original image as well as the perturbed image," said Bose. "However the changes are small enough, yet strong enough, that if a computer vision models, such as face detection systems, were to look at it, it would completely fail."

Joey Bose works in artificial intelligence and knows how pervasive facial recognition has become. Now that he's been able to trick one piece of the technology, he expects an app allowing users to keep their selfies to themselves to become available next year. (Michael Cole/CBC)

Bose says that in many cases, face detection systems are used automatically to track individuals or crowds and to control behaviour.

"People have no agency, no power to fight back," said Bose. "And we felt that it was really important to actually empower individuals and people to actually take back control of their own data."

Companies must put privacy first in product design

Ann Cavoukian wants governments and companies to think about potential issues with the technology, before they develop into catastrophic problems.

"Imagine if you're sick and you go to your doctor and he says, 'Yeah, we see there's some cancer developing. We'll just let it develop and if it gets worse we'll offer you some chemo.' What an unthinkable proposition," said Cavoukian. "I wanted it to be equally unthinkable to allow privacy harms to develop. And then you would seek a remedy by going to the regulator, after the harms had been done.Too little too late."

For years, Cavoukian has advocated for an idea she created called Privacy by Design. It means companies must design with privacy at the core, rather than as an afterthought.

The European Union's new data protection rules, the General Data Protection Regulation, incorporate the concept.

"Proactively identify the risks and embed privacy-protective measures to address and prevent those risks upfront," said Cavoukian. "Bake it into the code; bake it into the data architecture, into your policies, into your operations."

About the Author

Manjula Selvarajah

Tech Columnist

Manjula Selvarajah is a journalist, producer and syndicated tech columnist for CBC Radio One. Before journalism, she was vice-president of marketing at a Toronto-based tech startup. She holds an engineering degree from Queen's University. Send your story ideas to manjula.selvarajah@cbc.ca.
