'We behave differently if we know we're being watched'

Richard Lachman says Google's cancelled Sidewalk Labs project taught us valuable lessons about privacy

Surveillance technology subjects some people to more oversight than others

The now-cancelled Sidewalk Labs project aimed to create the most measurable community in the world. But would that have turned into a form of high tech surveillance embedded in the built environment? (Adam Killick)

What would it be like to build a city from the internet up? 

That bold question outlined the vision of Toronto's Quayside project, first unveiled by Sidewalk Labs in 2017. The smart city plan aimed to make urban life better for all. 

But not everyone shared that vision. The project raised concerns about data collection and privacy. 

In early May, the smart city project was cancelled, a story largely lost at a time when city streets were deserted because of the pandemic. 

But Richard Lachman was paying attention. He's the Director of Zone Learning and an associate professor, Digital Media, at Ryerson University. In an article for the online magazine Policy Options, he outlined why he believes the project was a good stress test for democracy that revealed needed repairs to our privacy protections. 

Lachman spoke with Spark host Nora Young about how data gathering can be a form of surveillance that subjects some people to more oversight than others. 

Here is part of their conversation.

So the now-cancelled Sidewalk Labs project in Toronto originally promised to make Quayside the most measurable community in the world. Why might a measurable city be concerning?

The challenge, when we think of using data for one purpose, is that it doesn't have to be used only for that purpose. So the question is: where does that data go? It's not that we necessarily know it's going to be used for nefarious purposes, but I can't guarantee it won't be. Once it starts being shared, then I'm much more worried.

Richard Lachman (Will Pemulis)

But do you think there's potential value in a measurable city, setting the privacy question aside, or assuming privacy is baked in? 

Absolutely. I mean if we think of the census being run in this country, better data can lead to better policy decisions. Where do people need attention? How many people use this service instead of another, and which people? 

It allows us to target where we spend government money. It allows us to maybe target where we need better healthcare programming, and maybe allows us to track things like availability of healthy food, availability of exercise, availability of city services. There are positive and negative aspects to it, and a measured city has wonderful potential for being able to deliver services in a much better way.

You wrote that the smart city concept pulls surveillance tech from our smartphone and embeds it into the built environment. So why does that move concern you?

We seem to be sharing data with companies right now with no oversight. So why would I be worried about taking that off my phone and now having the city or the province or the country have access to it? 

But when some of these privacy implications start affecting my relationship with law enforcement, my ability to cross a border, my ability to get a job, it affects even more of my life. 

Thinking of that moving into the hands of the state is a big shift. So imagine you get pulled over by a police officer; maybe you've been going five kilometres over the limit in a neighbourhood. The officer might or might not let you off with a warning. They take your licence, they go sit in the car, and they look something up. I don't know exactly what they look up at the moment. I'm assuming they're checking whether I have speeding tickets. 

But if that instead goes into an unknown algorithm that ties together a lot of different databases, maybe my border travel, my employment records, my healthcare or mental health records, that's scary. That means an unknown algorithm with no public transparency is producing a yes or no that judges and constrains my ability to move around in society.

So in the context of the protests, what lessons do you think we can take from the whole Sidewalk Labs experience?

I think we are seeing cases where being able to document things is really helpful. How many people were at this protest? How big is this issue, and what exactly was happening in which locations, at which times? That's helpful both from a state point of view and from an individual point of view. 

So one worry would be: imagine that level of monitoring from a place like Sidewalk Labs (it doesn't have to be them) were already in place. How would that change the protests that are happening now? We've seen cases of kettling. We've seen cases of police trying to document things and maybe tracking individuals, so that level of monitoring would make things worse. 

It's really hard, because what we're doing is conducting an experiment in real time, at scale. That was always the scariest part about Sidewalk Labs: it used a lab methodology, a sort of design-thinking approach. Let's test this, and if it doesn't work, we'll revise it. 

That's the speed of software: release it, and then update it again next week. That's really hard to do in society, because real people's lives are affected. 

Someone really is going to jail, someone really is unable to access a service, and your software update didn't take away the effect on that person's life. So what we're really seeing is the danger of doing this at scale, because it affects all of society, not just a tiny little focus group.

This interview has been edited for length and clarity. To hear the full conversation with Richard Lachman, click the 'listen' button at the top of the page.