As It Happens

People are 'scared and impressed' by this Nixon moon landing deepfake, says co-creator

Francesca Panetta says she and her colleagues "don't really want to terrify people" with their new video of Richard Nixon delivering an alternate reality moon landing speech.

Video shows Richard Nixon delivering alternate speech written in case the astronauts didn't make it home

A Richard Nixon moon landing deepfake video uses the former U.S. president's resignation speech as its source video. (MIT Center for Advanced Virtuality)


Francesca Panetta says she and her colleagues didn't mean to "terrify" people with their new video of Richard Nixon delivering an alternate reality moon landing speech — but she admits viewers are "pretty scared and impressed by it."

The video uses deepfake technology to create the illusion of the former U.S. president reading a sombre speech that was written for him in case astronauts Neil Armstrong and Edwin "Buzz" Aldrin didn't make it home from the Apollo 11 mission. 

"We don't really want to terrify people. We tried not to make a piece of kind of sensationalistic art," Panetta, who co-directed the installation with Halsey Burgund, told As It Happens guest host Gillian Findlay.

"It's an artwork, but it's also an education piece around deepfake technologies and the harms of misinformation."

The video is part of an art project by the MIT Center for Advanced Virtuality called In Event of Moon Disaster, and it's currently on display as an immersive installation at the Amsterdam Film Festival. 

  • Watch a clip from In Event of Moon Disaster:
This video uses deepfake technology to create the illusion of former U.S. president Richard Nixon reading an alternate moon landing speech for an installation called In Event of Moon Disaster by the MIT Center for Advanced Virtuality.

At the exhibit, viewers walk into an area decorated to look like a 1960s living room, complete with patterned wallpaper, a burnt-orange couch and a 1966 television set.

A rattled-looking Nixon appears onscreen, sitting at a desk, and delivers tragic news about America's mission to put men on the moon. 

"My fellow Americans," he says. "Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace."

The video is not real, but the speech is — though Nixon never delivered it. 

It was written by speechwriter and journalist William Safire in the event that the Apollo 11 astronauts were stranded on the moon, which was a very real possibility at the time.

"It's a really moving elegy, actually," said Panetta, a journalist and fellow at the MIT Center for Advanced Virtuality. "It almost reads like poetry."

In Event of Moon Disaster showcases its Richard Nixon deepfake video on a 1966 TV in a setting designed to look like a 1960s American living room. (MIT Center for Advanced Virtuality)

The subject matter is no coincidence, she said. 

"For 40 years there have been various different conspiracy theories [about the moon landing] that have been using those different types of misinformation, from writing books to, you know, fake documentaries," she said.

"So what we're trying to show is ... the latest technique that could be used to kind of take that conspiracy theory one step further."

How did they do it?

The video, says Panetta, was months in the making and a collaboration between sound engineers, a voice actor and AI experts from several different companies and organizations, including Respeecher, Video Dialogue Replacement and Canny AI, the team behind the deepfake video of Facebook CEO Mark Zuckerberg.

 "Publicly, we think that these things are really easy to make," she said. "[But] it's a really complicated process."

Francesca Panetta, a journalist and fellow at the MIT Center for Advanced Virtuality, is the co-director of In Event of Moon Disaster. (MIT Center for Advanced Virtuality)

They sorted through hours of Nixon speeches, she said, cutting them into thousands of one- to three-second clips, each of which a voice actor would listen to and then repeat.

The two recordings combined — Nixon's and the actor's — were used to generate the voice.

"Imagine that for every clip of Nixon, there's a corresponding one of our actor reading that same clip, and what it does is it learns the actor's voice and then anything that the actor then says comes out in Nixon's voice," she said.

"So it's not that the actor's voice itself turns into Richard Nixon's voice. It's that we've made an AI that can say anything in Nixon's voice."
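The pairing Panetta describes — every short Nixon clip matched with the actor's re-recording of the same words — is what a voice-conversion model would train on. A minimal sketch of that data-pairing step, assuming an illustrative clip layout (the `utterance_id` key and file paths are invented for the example, not the project's actual pipeline):

```python
# Sketch: assemble (actor, Nixon) training pairs by matching clips of the
# same utterance. The model then learns to map the actor's voice onto
# Nixon's; anything the actor later records can be converted.

def build_parallel_corpus(nixon_clips, actor_clips):
    """Match clips by a shared utterance id and return (actor, nixon) path pairs."""
    nixon_by_id = {c["utterance_id"]: c["path"] for c in nixon_clips}
    return [
        (a["path"], nixon_by_id[a["utterance_id"]])
        for a in actor_clips
        if a["utterance_id"] in nixon_by_id  # skip clips with no Nixon counterpart
    ]

nixon_clips = [
    {"utterance_id": 1, "path": "nixon/001.wav"},
    {"utterance_id": 2, "path": "nixon/002.wav"},
]
actor_clips = [
    {"utterance_id": 1, "path": "actor/001.wav"},
    {"utterance_id": 2, "path": "actor/002.wav"},
    {"utterance_id": 3, "path": "actor/003.wav"},  # actor-only clip; dropped
]

pairs = build_parallel_corpus(nixon_clips, actor_clips)
```

In practice this corpus would feed a neural voice-conversion system (the project credits Respeecher for that stage); the sketch only shows why both recordings of each clip are needed.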

Then there's the video of the Nixon speech itself. The source footage isn't related to the moon landing. Rather, it's from the former president's resignation speech.

"Even though they would be different words, we felt that visually he looked the most moved and we felt that corresponded with this Bill Safire speech," Panetta said. 

"We then really carefully constructed the sound to match the visuals in ... software editing programs."

Why did they do it?

The final product, Panetta says, is both art and education. 

"Deep fake technology is concerning, and that is one of the reasons that we made this piece," she said.

The installation's visitors are each given a newspaper that describes how the video was made, and a postcard with tips on how to spot deepfakes — including keeping an eye out for audio that doesn't sync properly, and "looking around the edges ... for shadows that seem a bit strange."

In Event of Moon Disaster is a deepfake art project that uses technology and misinformation to rewrite history. (MIT Center for Advanced Virtuality)

In the spring, the team will roll out an expanded version of the project that includes other types of alternate universe moon landing media. 

"A number of academics and experts we spoke to said that we shouldn't really just think about deepfakes, that actually the whole landscape of misinformation is really important to consider," Panetta said. "So cheap fakes to deepfakes."


Written by Sheena Goodyear. Interview produced by Morgan Passi. 
