Spark

Tech titans like Google and Facebook are built on a 'house of cards', says Oxford philosopher

From walls to WhatsApp, privacy is a collective responsibility, says Carissa Véliz, who researches the relationship between power and privacy. And, we take a historical look at privacy and its connection to technological change, with David Vincent.

Individual privacy is a relatively recent idea in human history, but it is now more important than ever.

Power weaves its way through our relationship with tech companies and that has a big impact on our privacy. (Adam Killick)

Privacy.

It's one of those things that everybody says they value, but that not everyone takes seriously, at least until it becomes necessary to do so.

If it's tempting to say, "well, I have nothing to hide," or "who cares if everybody knows that I like red shoes," then there are some other things we should consider, according to Carissa Véliz, a philosopher at the University of Oxford and an associate professor at the university's Institute for Ethics in AI.

Philosopher Carissa Véliz researches power, ethics and privacy. (Oxford University)

But the data collected about you reaches much further than you might think.

"I argue privacy is mostly a collective issue. And whatever personal data you have, that contains personal data about other people such that you actually don't have the moral authority to give up that data because it contains data about others, and they haven't consented to it," Véliz, who is also the author of Privacy is Power: Why and How You Should Take Back Control of Your Data, told Spark host Nora Young.

From a philosophical point of view, power is intimately connected to knowledge, she said. Many of us are familiar with the expression, "knowledge is power," but the reverse is also true, she pointed out. "If you have a lot of power... you get to decide what counts as knowledge.

"Google gets to decide what counts as knowledge about you, they get to decide what personal data is shared with others. And in some cases, it might be inaccurate, in some cases, it might be very biased, or just a slice of a view."

Most people see that manifesting in advertising, she said, which is reasonably benign. But something more sinister underlies it.

"Say you go into a webpage. And while the webpage is loading, a company like Google might be sending your personal data to hundreds of other companies that want to show you ads. And they offer different amounts of money in order to show you an ad, if there's one company that is reasonably confident that you might be interested in their ads, they offer more to show you that ad, and then you get shown the ad.

"But by that time, hundreds of companies have your personal data. And this is truly sensitive stuff, like your political tendencies and sexual orientation. And many times, you haven't had even time to consent."

At the heart of this are the data brokers, who collect the data scoured by the likes of Google and Facebook, and sell that information to the highest bidder, without much regard for how that data might be used. It's also used in surprising ways: Véliz said men tend to get ads for higher-paying jobs. When you call for customer service, data about you might determine how much time you spend on hold waiting to be helped.

And no detail collected about you is small enough to escape the data brokers' attention, she added. "Even just something as simple as calculating the speed at which you walk can be used to infer your life expectancy. And of course, that can have effects for how much your premium is for your insurance company, or whether you get different kinds of insurance."

Facebook uses data in a slightly different way, to influence your personal relationships, she said.

"Because a company like Facebook is built in such a way that it is possible for hackers, say in Russia, to show personalized content to people in the US to inflame society. So they choose two groups, and they decide to pit them against each other. And they're incredibly successful. And there have been cases in which there are demonstrations on the ground. And on both sides, they have been organized by Russians."

Every act of resistance matters ... because companies are very sensitive to people's feelings about this.- Carissa Véliz

Perhaps even more disconcertingly, many national governments are complicit in the data collection, because it serves a purpose for them, too.

"From 9/11 onwards, governments realized that they could make a copy of the data and use it for the purposes of national security. And many times, tech companies have helped the government to surveil and governments also help companies by not regulating them, and by protecting them in different ways."

There is hope, however.

Companies like Google and Facebook are utterly dependent on personal data for their business model. So if we stop sharing so much data with them, they could collapse "in a matter of a few weeks," she said. "So a company like Apple, and they might be criticized for a lot of things. But they have stuff that they sell, they have computers and phones that people can buy. Whereas Google doesn't have any of that—Android is too cheap for them to sustain themselves and they really thrive off the data. And Facebook is even worse in that regard."

Obviously, it's not reasonable to expect people to stop using Google or Facebook completely. "But every act of resistance matters a lot, because companies are very sensitive to people's feelings about this." She suggests using alternative apps that are more privacy-minded.

But aren't there times when it's important for security agencies to know data about people, so they can, say, track the perpetrators of the recent attack on the U.S. Capitol?

"Even if all our communications were encrypted, the police would have much more data than ever before in history, just because of metadata. Metadata is data about data. And it's data that is created by computers, when they interact with each other, it cannot be encrypted. And that already is enough data to find the bad guys whenever there is a problem, she said.

Personal privacy is a recent concept, historian says

The idea of "personal space" in the physical world—at least in the West—is a notion that didn't really exist more than a couple of hundred years ago, said historian David Vincent, professor emeritus at the Open University and the author of Privacy: A Short History and A History of Solitude.

Historian and author David Vincent. (David Vincent)

"The adolescent retreating into their rooms slamming their door in the face of their parents, is a luxury of the 20th century," he said.

Indeed, as recently as the 18th century, intimate activities like sex and going to the bathroom were done in common spaces, because there weren't many individual rooms in most houses. Most homes had only a single large bed that was shared with an entire family and even their servants.

It wasn't until the growth of the middle class that many of the aspects of our homes we now take for granted—doors, for example—began to appear.

Few homes, even mansions, were built with hallways, meaning that to get to one's own rooms, one would have to walk directly through others' rooms, he said.

Events like the Great Fire of London in 1666 pushed forward technology that helped privacy, like mandating brick or stone walls instead of wood. And it wasn't until the Victorian era that thicker, sound-resistant walls were built inside homes.

It wasn't until 1890, in an American legal judgment, that the phrase "right to be let alone" first appeared, Vincent said.

"It was not how privacy was thought about in the past."
