Spark

The vices and virtues of planned obsolescence

Planned obsolescence has been around for longer than you think. But just how dangerous is the constant-upgrade train?



Many of us are familiar with "planned obsolescence," the idea that our devices are designed to become useless long before they actually stop working, forcing us to upgrade whether we want to or not.

But where did this idea come from, and does it have any benefits?

Jonathan Sterne, a professor of communications studies at McGill University, traces planned obsolescence back to the late Victorian fashion industry.

"Advertising became more important. Brands started to exist. And fashion started moving at a faster pace. Fashion has existed forever. But fashion as sort of a mass interest came with mass newspapers, mass magazines and also advances in printing technology, especially chromolithography and stuff like that, where you can have colour pictures and magazines," he told Spark host Nora Young.

From there the idea took hold in the burgeoning automobile industry, he said. But the concept really took off in the post-war manufacturing boom.

"Manufacturers often started using less good materials, economizing on how they built things, so things would break down sooner, warranties got shorter. Because they realized if you're selling more refrigerators, it becomes harder to support all the people with the refrigerators forever."

That was matched with a growing sense of consumerism, brought on by increased wealth in the 1950s and '60s.

"So we have this situation where people are simultaneously being convinced that they need to buy more things, and that buying is more and more important to their identities, as manufacturers, as businesses are speeding up the rate at which physical objects in your life are supposed to be replaced."


The idea of planned obsolescence became structurally ingrained in 20th century capitalism, as shareholder dividends and, later, venture capitalists wanting return on investments helped push the pace of innovation and iteration.

The problem is, that structure doesn't account for the costs of the constant-upgrade cycle: emissions from manufacturing, toxic waste and harmful labour practices, he said. "We need a different way of thinking about things, objects and progress than 'newer, faster, better.'"

He conceded the line between true innovation and inconsequential upgrades is often fuzzy. "I would say that the history of technology is littered with examples of things that were better that didn't succeed commercially."

Do you really need four cameras on a smartphone?

Mohan Sawhney agrees that the line is fuzzy between something that's truly new and iterative innovation that forces people to upgrade their technology. Sawhney is the Associate Dean for Digital Innovation at Northwestern University, and the McCormick Foundation Professor of Technology at Northwestern's Kellogg School of Management.

There's a perception that devices like smartphones are deliberately rendered obsolete, he said, "but the fact is that the older hardware may not have the capacity to run all of the new features in your operating system or in your software, and trying to tax it too much to run all of the new features may actually take away from the performance."


Sawhney acknowledged that there are trends in technology that wax and wane. Smartphones have been getting bigger and bigger, for example. "Funnily, if you go back in history, and Motorola created the StarTac, it was all about making the phone smaller. And it became smaller and smaller to the point that Sony had one so small, that you could either put it to your ear, or you could put it to your mouth."

Similarly, cameras have become vectors for innovation lately, he said, adding that as many top-end phones now sport three cameras, it might be time to move to another vector.

"Now, even directions or vectors are becoming harder and harder to find. First, it was the obvious one, make it bigger, make the display better, then it was like, 'OK, let's make the camera better'. And now it's more subtle things like maybe using artificial intelligence to process your images or to offer voice assistance and so on."

Moreover, many companies are moving toward abandoning the concept of ownership altogether. Sawhney calls this "servicization," or the idea that companies are selling a service more than a product.

"Take a company like Adobe: you cannot buy any Adobe software, not any software that Adobe makes. You cannot buy it. It is only rented to you."

And that model is now being extended to hardware. "To the extent that you are subscribing or doing sort of a hardware lease, in effect, the [upgrade] problem gets shifted from the customer to the vendor. So now, if my hardware is obsolete, it's not my problem, but it's the vendor's problem."

Sawhney concedes that servicization still doesn't address the environmental cost of planned obsolescence, which is becoming more concerning to consumers. "I think more companies are paying more attention to recycling and to the environment for different carbon footprints and so on. But they're all riding a tiger, like you can't get off."


Written by Adam Killick. Produced by Adam Killick and Michelle Parise.
