Ideas

Tech's Moral Void


Tina Pittaway asks: What will it take for tech giants to confront their moral void?

As tech companies grow at breakneck speed and the chaos of their unfettered impact becomes more obvious, calls for a reckoning are growing louder. What will it take for tech giants to confront their moral void? (Pixabay)

Lawyers and doctors have codes of ethics. Teachers have them. Even journalists have them. So why not the tech sector, the people who create and design our very modes of communication? Coders and designers make products that allow us to communicate with each other, across cities and nations and borders. How we speak and how many people we reach determine what we buy and sell, affect our health and economy, and — as we've come to realize — influence our democracy. Contributor Tina Pittaway explores whether the time has come for tech to reckon with its moral void.

They made these platforms very evidently not neutral, very evidently not a blank canvas and very evidently controlled by the creators of those platforms. And yet they were still saying, well, we're neutral and we're just a blank canvas. - Anil Dash

For decades, technology was seen as neutral: its meaning was determined by how it was used. Just as a pen can be used to write things that build up or tear down, so too can a communication platform. But what happens when the creators and designers of technology imbue it with their own biases?

These biases themselves may not be nefarious — we may not be talking about overt racism or misogyny — but we all carry unchecked assumptions around with us. For example, given we have a tech sector that is overwhelmingly white and male, and designers design for the world they know, should we be surprised when self-driving cars can't detect black pedestrians?

YouTube may be seen simply as a platform to post videos. But when engineers design algorithms that choose what content the platform amplifies or buries, where are we left when we start seeing more violent and racist content — so much so that some users are even being radicalized? How do we deal with technology that isn't designed with the most vulnerable in mind?


These concerns are pushing long-time designers and engineers to ask hard questions about the moral obligations of the tech sector and what it means, in the digital age, to do no harm.

Guests in this episode:

  • Mike Monteiro is design director and co-owner of Mule Design Studio in San Francisco.
  • Anil Dash is the CEO of Glitch in New York City.
  • Zeynep Tufekci is an associate professor at the University of North Carolina's School of Information and Library Science and an adjunct professor in the Department of Sociology.

Further reading:

  • A Designer's Code of Ethics by Mike Monteiro, published by Scout Books, 2017.
  • Twitter and Tear Gas: The Power and Fragility of Networked Protest by Zeynep Tufekci, published by Yale University Press, 2017.


This episode was produced by Tina Pittaway and Naheed Mustafa.
