As It Happens

'None of us are safe' from fake celebrity porn apps, tech expert warns

A sophisticated app is using AI to make fake porn out of anyone's videos — and it's raising all sorts of legal questions.
Star Wars star Daisy Ridley's face is superimposed onto a porn star's body using FakeApp. (FakeApp)

Story transcript

FakeApp uses an algorithm to create high-quality face-swapped porn, in which one person's face is attached to another person's body in an existing video — in this case, a porn star's. The technology has exploded in recent months.

As Motherboard reports, app users have been making realistic fake porn videos of celebrities like Daisy Ridley, Gal Gadot, and Jenna Fischer from The Office.

On Wednesday, Reddit suspended the forum where FakeApp originated, and online pornography giant Pornhub has banned the app's videos from its site.

Eric Goldman, a tech law professor at Santa Clara University, has been following the rapid development of this technology and warns "none of us are safe from being a victim of that kind of fakery."

Here's part of his conversation with As It Happens host Carol Off. 

How easy is it to do this? 

The technology is pretty good. The results are quite lifelike and realistic. And the reason why that's significant is because the app is in the hands of amateurs.

This is not technology that's in the hands of the movie studios spending a billion dollars to make movies in a year. This could be you or me using the app to create our own fake video.

Is the pornography-producing industry using this?

There has long been a genre of fake pornography videos, but most of the time they were so obviously fake that they were a category that didn't get taken seriously. It was almost like a joke.

But now it might be that the fake videos could be intermixed with the real videos, and the ordinary viewer — and even a sophisticated viewer — may not be able to distinguish them.

The deepfakes subreddit — the online forum where FakeApp originated — has been taken down. 'We want to make Reddit a more welcoming environment for all users,' Reddit said. (Reddit)

The people who are producing these pornography videos, are they breaking the law?

The answer is almost certainly yes, but unpacking how they're breaking the law and who would have the right to shut them down is a pretty complicated story.

The real question isn't whether they're breaking the law, but could somebody do anything about it? And it's possible that irrespective of the slew of laws that might apply here, it may be difficult to actually enforce any of them effectively.

But surely this is something that could end up being used as revenge porn as well, is it not?

Absolutely. The app can be used to create a fake pornography video of a person who is being targeted for revenge or harassment or for some other antisocial purpose. The app would be perfectly suitable for that purpose.

And that would be illegal. Could someone be charged with revenge porn because of that?

Yes, if we could figure out who created the video and it was somebody who was physically proximate to the victim and was identifiable through their activities, there would be a number of ways in which the laws could make that person pay for their misdeeds.

But there's a whole range of other circumstances where that won't be the case. And it will be much more complicated to figure out who made the video, or who actually has the right to enforce against its creation.

What are some of the other applications? Because obviously if you can make fake videos for pornography, you can do all kinds of things. You could even put people into news stories that they didn't belong to and then post fake news. So what other things should people be concerned with?

The possibility of creating fake videos in a wide range of circumstances is really unlimited.

So, for example, it's possible to insert someone into a historical moment — kind of like Forrest Gump, if you remember that movie.

That kind of fakery could be used to make someone look like they were at the scene of a crime, or make them look like they were participating in some illegal venture that they actually were not a part of at all.

We don't really know if we can believe any video that's ever published, because of the possibility that it's been manipulated.

So, in the end, I guess, it muddies the waters, doesn't it? I mean, you could deny that you were actually in a video for real.

It's going to be a big social adjustment for us to realize that we either have to rewire our brains to disbelieve everything, or we're going to have to develop some better mechanism of authenticating that photos and videos have not been manipulated in the first instance.

This interview has been edited for length and clarity.