As It Happens

Dragon cats and killer orchids: An AI's vision of the future based on 2020 headlines

From mysterious monoliths to glowing marsupials and murder hornets, As It Happens covered some seriously strange stories in 2020. But a list of computer-generated headlines inspired by the past year suggests robots probably aren't coming for our jobs just yet.

Janelle Shane, research scientist behind the AI Weirdness blog, trained the algorithm to generate news headlines

Based on a sample of 2020 headlines, a computer algorithm predicts killer orchids in our future. (Rick Bremness/CBC)

From mysterious monoliths to glowing marsupials and murder hornets, 2020 has brought us some pretty strange stories. But a list of computer-generated headlines inspired by the past year is even weirder.

"Massive radioactive sinkhole continues to grow in Russia," "Mysterious Origin of Monster Deep-Sea Toads Solved," and "A sassy tardigrade previews new Doctor Who," are just some of the examples an algorithm spat out after being fed a series of actual 2020 headlines. 

"They run the gamut from incredibly dark ... [to] the opposite end of the spectrum," Janelle Shane, the research scientist who came up with the idea and trained the algorithm to generate headlines, told As It Happens host Carol Off.

"There's one that goes, 'From deep in the Earth, darkness boils to the surface,' which is actually very 2020 in a way."

On her blog, Shane looks at the "sometimes hilarious, sometimes unsettling" ways in which algorithms err.

Simply put, artificial intelligence learns from experience and adjusts to new data in order to perform human-like tasks. Machine-learning algorithms have a reputation for being smart, but they can still get things wrong.

The algorithm in question, GPT-3, is known for mimicking human language.

But GPT-3's training data was all collected before 2019 — which means it knew nothing about 2020 until Shane fed it a list of headlines from this year. 

"What I really like is that it has no indication whether these are real headlines, whether these are tabloid headlines, and you can see that it is a little bit unsure," she said.
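Shane's approach — showing the model a handful of real 2020 headlines and letting it continue the list — is a form of what's known as few-shot prompting. Here is a minimal sketch of how such a prompt might be assembled; the example headlines and the `build_prompt` helper are illustrative assumptions, not her actual prompt or code:

```python
# Few-shot prompting: prepend real examples so a language model, asked to
# continue the list, generates new items in the same style.
# These headlines are illustrative stand-ins, not Shane's actual prompt.

REAL_2020_HEADLINES = [
    "Mysterious metal monolith discovered in Utah desert",
    "Scientists find glowing marsupials under UV light",
    "'Murder hornets' spotted in North America for the first time",
]

def build_prompt(examples, title="Strange news headlines from 2020:"):
    """Assemble a numbered list of example headlines, leaving the next
    number dangling so the model's continuation becomes a new headline."""
    lines = [title]
    for i, headline in enumerate(examples, start=1):
        lines.append(f"{i}. {headline}")
    # The dangling item number invites the model to fill in item 4, 5, ...
    lines.append(f"{len(examples) + 1}.")
    return "\n".join(lines)

prompt = build_prompt(REAL_2020_HEADLINES)
print(prompt)
```

The prompt text would then be sent to the language model, whose completion — the text it appends after the dangling number — is the generated headline.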

Some other headlines that were generated include, "What are 'dragon cats' and why they are getting hyped?" "Lost NASA space shuttle repair robot comes back to life after four decades in complete darkness" and "When Killer Orchids Attack: How the Deadly Corpse Orchid Is Turning Up in U.S. Backyards."

Janelle Shane (Janelle Shane/Twitter)

None of these stories took place in 2020, but the headlines can sound somewhat convincing given the incredible year we've had. 

There are also, "Reports of a '10-foot tall penguin' roaming about on the Family Islands in the Solomon Islands are investigated," "Proof that a hellhound is living at Los Angeles Airport has been provided in the photos below" and "First naked bogman has been found out walking the great British countryside," which are more obviously false.

According to the research scientist, the real headlines she fed into the algorithm were about animals, science and nature. Some of those real stories, like the one about the murder hornets, were also covered by As It Happens this year.

But even with all the information logged on the internet, Shane says her algorithm lacks the ability to tell which events have actually happened in the real world.

"AI doesn't understand … the context of what it's doing and how the world works … what's realistic and what's not," she said.

"And so you get the very weird stuff suggested along with the real stuff and the filtering is kind of spotty."

According to Shane, algorithms perform well when given tasks that are clearly and narrowly defined, but less so at solving problems that require creativity and flexibility.

"These AIs are really depending on their training data. They can only replicate the situations of the past. When they're doing all their predictions, they're predicting past behaviour," she said.

"And so we can see that come up when we're trying to have these algorithms predict who should be hired or what scores students will get on their exams. They kind of predict that things will never change. Things will stay the same."

So while the algorithm was able to create 2020-esque headlines, they don't capture the full scope of this year. The algorithm lives in a world where it can't understand the complexity of our human experience.

"If we don't tell it about 2020 specifically, then it doesn't have 2020 to draw on when it's trying to figure out if these are real headlines or not."

Written by Mehek Mazhar. Interview produced by Matt Meuse.

