Quirks & Quarks

AI is now learning to do things it hasn't been taught

Deep learning AI network develops a sense of numbers by mimicking how humans process visual information

An artificially intelligent deep learning network trained to classify images evolved its own sense of numbers. (Peshkova/Shutterstock)

Deep learning neural networks were designed to mimic how humans learn. And now a team of researchers has discovered these systems can develop a sense of numbers — just like babies and some animals can — without ever being taught.

In the last few years, researchers around the world have made remarkable strides with AI learning algorithms by training them to do particular tasks, like recognizing certain kinds of images or patterns in data. But while these systems have become extremely proficient at the tasks they are trained on, they can't do anything else.

"The surprising result was that this network, even though we never trained it to discriminate the number of objects, later was also able to tell us the number of objects in a visual scene," said Andreas Nieder, the senior author on the study and professor of animal physiology at the University of Tübingen, Germany.

"These neurons behaved exactly the same way as real neurons in the brains of humans or of animals." - Andreas Nieder, University of Tübingen

How this AI system developed a 'sense of numbers'

Nieder and his colleagues trained their deep learning neural network by exposing it to thousands of images, which it learned to sort into specific groups, such as animals or types of vehicles.

Once the system became a proficient image classifier, they fed it images that had varying numbers of simple dots in them.

"This network was instantaneously able to tell us, at least in an approximate way, how many dots were in these displays," said Nieder.
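As a loose illustration of that probing step, here is a minimal NumPy sketch. It is not the team's actual network: the dot images, the random filter bank, and the summed read-out are all invented for illustration. It only shows how the overall response of a bank of visual filters can already track, approximately, how many dots are on screen, without any training on counting:

```python
import numpy as np

rng = np.random.default_rng(0)

def dot_image(n_dots, size=28):
    """Render n_dots random single-pixel 'dots' on a blank image."""
    img = np.zeros((size, size))
    ys = rng.integers(0, size, n_dots)
    xs = rng.integers(0, size, n_dots)
    img[ys, xs] = 1.0
    return img

# A stand-in "network layer": a bank of random non-negative filters,
# loosely analogous to visual features that were never tuned for counting.
filters = rng.random((8, 28, 28))

def layer_response(img):
    # Each unit takes a weighted sum of the whole image; the rectifier
    # keeps responses non-negative, as in typical deep-network units.
    return np.maximum(0.0, np.tensordot(filters, img, axes=2))

# Average the total population response over many random displays for
# each numerosity: it grows with the number of dots, giving a crude,
# approximate read-out of "how many" with no counting ever trained.
mean_resp = [np.mean([layer_response(dot_image(n)).sum() for _ in range(50)])
             for n in (1, 4, 16)]
print(mean_resp)  # roughly increasing with numerosity
```

The point of the sketch is only that an approximate "more vs. fewer" signal can fall out of generic visual machinery; the study's finding was the stronger result that individual trained units were tuned to particular numbers.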

The AI system learned like we do

"The second very surprising finding was that when we then looked to find out how these artificial neurons of the network would represent the number of objects in a set, these neurons behaved exactly the same way as real neurons in the brains of humans or of animals," added Nieder.

We process visual information in a layered hierarchy. It starts with simple raw data in the form of neural signals in the retina, which then gets processed through other neural systems in the brain until, eventually, we are able to take in and understand the entire visual scene in front of us.

When Nieder and his colleagues looked at the artificial neurons in their deep learning network, they found multiple layers of complexity and a similar type of hierarchical processing.

"And this is kind of mimicking the visual system that we have in our own brain."
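That layered hierarchy can be mimicked in a few lines of code. The sketch below is purely illustrative (the layer sizes and random weights are arbitrary, not the study's architecture): it stacks stages so that each one re-combines the features of the stage below it, the way both the visual system and a deep network build up from raw input to abstract categories:

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    # Standard rectifier nonlinearity used between layers.
    return np.maximum(0.0, x)

# Three stacked stages: pixels -> simple features -> complex features
# -> category scores. Each stage only sees the output of the one below.
sizes = [784, 128, 32, 10]
weights = [rng.standard_normal((m, n)) * 0.05
           for n, m in zip(sizes, sizes[1:])]

def forward(pixels):
    h = pixels
    for w in weights:
        h = relu(w @ h)   # each stage transforms the stage below it
    return h

scene = rng.random(784)   # stand-in for raw retinal input
out = forward(scene)
print(out.shape)          # ten category scores at the top of the hierarchy
```

The hierarchy is the design choice doing the work here: no single stage understands the scene, but each one builds a slightly more abstract description than the last.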

Babies develop a sense of numbers long before they can count. Scientists think this ability to spot differences in numbers of objects is innate, developing from the hard-wiring of our visual processing areas. (Jamie Malbeuf/CBC)

What this says about our innate number sense

Human babies and some smart animals like monkeys or crows can't count, but they can estimate the number of objects in a visual scene. It was this ability that Nieder and his team discovered in their deep learning neural network, despite the fact that it had only been trained to classify images.

This suggests that basic numeracy skills that we and other intelligent animals possess might be hard-wired into the way vision works.

