Blade Runner and what it means to be human in the age of AI

Blade Runner 2049 is out this week, raising provocative questions about the future of artificial intelligence. We look at what our pop culture fascination with androids and human-like tech says about what makes us human.

This segment originally aired in October 2017.

This week, Blade Runner 2049 came out in theatres, and the Spark team is really excited!

We love the original Blade Runner — well, not the 1982 theatrical release with its happy ending and missing scenes, but rather The Director's Cut (1992) or The Final Cut (2007), which are both soooooooo much better!

Besides nerding-out over which version of Blade Runner is best and whether the new one stacks up, the Spark team also loves thinking about the way fictional cyborgs, replicants and androids challenge us to think about what it really means to be human.

We can look as far back as Ovid's Pygmalion, a poem about an artist who falls in love with a sculpture...

Art hid with art, so well perform'd the cheat,
It caught the carver with his own deceit:
He knows 'tis madness, yet he must adore,
And still the more he knows it, loves the more:
The flesh, or what so seems, he touches oft,
Which feels so smooth, that he believes it soft.

And then the sculpture comes to life! Whether he likes her as much once she's animate, we'll never know, since Ovid moves on to his next poem in the Metamorphoses.

But our guess is, she became just as tiresome as all the real ladies were to picky ole Pygmalion (read the whole poem, and you'll get the drift!)

Whether it's a sculpture or a cyborg, there's something compelling about the idea of an artificial being that has consciousness. Like cyborg Major Motoko Kusanagi in Ghost in the Shell...

And of course, the Blade Runner replicant Roy Batty...

As a metaphor, these fictional characters help us explore the deep connections we have with technologies. And help satisfy our fascination with bringing objects "to life".

And so the Spark team can't help but circle back to some of the questions we've asked over the years, especially now that context-aware devices are becoming more a part of our everyday lives.

Can a future where machines are self-aware be far behind?

When the movie Ex Machina came out in 2015, Spark host Nora Young spoke to philosophy professor Evan Thompson. She wondered if it would help us understand AI better if we could grapple a bit with the mystery of human consciousness.

Evan is the author of Waking, Dreaming, Being: Self and Consciousness in Neuroscience, Meditation and Philosophy. According to Evan, shared experience is key to human consciousness.

"People think the question would be, for a robot or android or artificial intelligence, is that system really conscious?" he told Nora. "And that is an important question, but one of the ways we would be affected in our thinking about that would be whether that system or being is able to enter into the kind of shared subjective life that we have with each other."

And although Evan said he believed consciousness is something an AI could theoretically have one day, "I don't think it would be possible simply through fancy programming."

"Consciousness depends on specific biological processes, so if one were to create a system that would be conscious, one would have to duplicate the power of those complex biological processes, the electrochemical processes we see in the human brain and living cells and organisms."

As Evan said, biological processes and shared experiences are key to human consciousness. They're what set us apart from any artificial intelligence we develop. But does that even matter?

What if, as with the replicants in Blade Runner or the disembodied voice of a personal assistant like Cortana or Siri, just seeming conscious is enough for us?

"There's something a little bit strange when, for example, Siri will crack a joke or say something sassy about its rival AI programs, because we don't expect software to have opinions at all." - Brian Christian, author of The Most Human Human

In 2013, we spoke to author Brian Christian about the film Her, the story of a man who falls deeply "in love" with the disembodied voice of a computer operating system.

The premise may seem a little far-fetched, but it's actually not that big a leap considering how we already relate to AI programs that seem human...

"We don't really interact with Siri as though it's a book written by an author. We treat it as its own thing," Brian told Nora in 2013. "So if Siri says something funny, we don't fall in love with the programmer, we fall in love with the program."

The thing is, we're designing systems that we rely on more and more for customized, personalized tasks — systems that are around us all the time, and that we relate to in human-like ways.

We want to relate to them in human ways. Think of those digital assistants like Siri that we talk to on our phones, or increasingly, in our homes.

As our relationship to our devices becomes more intimate, maybe that world of the movie Her, or Blade Runner 2049, is not as strange as it seems.