Quirks & Quarks

Your baby can recognize the shapes of faces before it's even born

Scientists can now decipher the neural signals in the brain that code for faces.
Baby staring adoringly at his mother. (Christopher Michel)

When babies are born, one of the things parents tend to get really excited about is how babies like to stare at their parents' faces. And that's not just the imagination of overly adoring parents. 

Science has shown that from the moment of birth, infants are attracted to faces.  They show a strong preference for faces — over all other kinds of shapes and objects. 

This fascination with faces only intensifies as humans age.  In fact, our brain seems to be specifically wired to recognize and discriminate human faces in a way that's different from almost any other object.  It's why people might forget names — but so many of us never forget a face.

Two scientists help us get a better understanding of our affinity for faces.

Dr. Vincent Reid is a professor of psychology at Lancaster University in the United Kingdom. He has shown that infants are attracted to face-like shapes even before they are born — in the womb, as early as 34 weeks after conception.

Dr. Doris Tsao is a professor of biology at the California Institute of Technology in Pasadena, California, and an investigator with the Howard Hughes Medical Institute. She's cracked the brain's neural code for faces.

This interview has been edited for length and clarity.

Bob McDonald: How do you study what kind of visual preferences a fetus shows in the womb?

Dr. Vincent Reid: Well, what we did is we took the same sorts of studies that we do with newborns and applied that kind of research to the fetus. So what you do is you create an image that looks like a face. Then you project it to the fetus through the uterine wall. And then you move it slightly to the side and see if the fetus will track that particular object. We found that they were more interested in looking towards an object that was shaped like a face than one that was upside down. It's the same thing upside down, but it doesn't look like a face when you invert it.

BM: How did you project the image of a face to the fetus? 

VR: Well, what we did is we built a model of how the maternal tissue would change anything we presented to the fetus. Then we tried to figure out what it would look like from the point of view of the fetus. Maternal tissue does two things. The first thing it does is block light. The other is that it diffuses the light, so it makes it go blurry. So we made the stimulus very small, but defined, and it blurred out to the point where it looked like three dots.

When you have a newborn, if you show them three dots, two representing the eyes and one the mouth, they find that more interesting than the same three dots upside down. What's interesting is that this configuration of two dots representing the eyes and one representing the mouth is what's known as a face-like stimulus. Obviously it's not a face, but when newborns are born their visual acuity is actually quite poor, which means that when they do look at someone's face it's all blurry. And it kind of blurs into those sorts of shapes anyway.

What we know is that a newborn prefers to look at objects when there's more information in the upper than the lower visual field. They like to look at things like a capital T, but if you turn it upside down, that's not so interesting. What we found is that for the dots that looked like a face, they would move their head and track more than when we did the same thing with the dots upside down.

This is a 4-D ultrasound of a fetus tracking the stimulus. (Kirsty Dunn & Vincent Reid)

BM: Why do you think they have that preference?

VR: So there were three options before we did the study. One was that after birth, maybe some experience changed the way that the visual system was sensitive to information in the upper visual field. We've ruled that out now. So that leaves two things.

One option might be that there's an innate mechanism. It could be that this is just in our genes: it's just what we are, and these things emerge as we develop. Now, the argument for this is that it's adaptive. If a newborn baby looks at you, you treat them like a social partner. You want to play with them and you want to treat them as if they are engaging with you.

The alternative is that there's something in the prenatal environment: the bias to look at faces may be triggered by exposure to patterned light in the womb. It's possible, for example, that the maternal rib cage could introduce variation in the light field, so there could be more light in the top half of whatever the fetus is looking at when contrasted with the lower half of the abdomen. What that means is that there's variation in the visual environment for the fetus, which may drive the bias that they then produce.

BM: You know, it's interesting, this preference for seeing faces right side up, because I've talked to astronauts in space and they say the same thing. In weightlessness, you can be in any position. You don't have to be upright; you can be floating around. And they find that if they want to have a conversation with someone, they're not comfortable doing it upside down. They prefer to orient themselves so both their faces are upright, even though they don't have to.

VR: Well, it makes perfect sense, doesn't it? Because that's how we see faces. We're just experts at looking at faces. And you can't really tell what someone is doing with their face unless it's in the right orientation.

As Dr. Reid's study underlines, humans are experts at faces, which is probably a reflection of our intensely social nature. Just think about it: we don't just see faces on people. We see them in tree bark, in vague patterns in clouds, in the strangest caricatures. And that's because we really are wired to see faces. It happens instantaneously in the brain. And it goes beyond recognizing generic faces: almost all of us are incredibly good at discriminating individual differences, so we can instantly recognize, even from a bad angle, "Yes, I know THAT face."

A team from Caltech, led by Dr. Doris Tsao, a professor of biology, has cracked the brain's code for faces. Working with macaque monkeys, the researchers inserted electrodes into the monkeys' brains in areas known to code for faces, to get a reading of the neurons firing. By systematically doing that with faces and parts of faces over many years, they've come up with a way to decipher the neural signals and eerily reconstruct images of the faces the monkeys were looking at.

This figure shows eight different real faces that were presented to a monkey, together with reconstructions made by analyzing electrical activity from 205 neurons recorded while the monkey was viewing the faces. (Doris Tsao)

Dr. Tsao did this by developing a system of 50 points, shown as white dots in the video below, to translate the neural signal into a map of facial features, which allowed the team to reconstruct the faces the monkeys saw.
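To give a feel for how a face can be read out from neural activity, here is a minimal sketch, not Dr. Tsao's actual code or data. It assumes, as the study describes, that each neuron's firing rate varies roughly linearly with a face's position along a set of feature dimensions; under that assumption, the feature values of an unseen face can be recovered from the recorded population response by ordinary linear least squares. All names and numbers here (the simulated tuning matrix `W`, the noise level, the 300 training faces) are illustrative assumptions; only the counts of 205 neurons and 50 feature dimensions come from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 205     # neurons recorded in the figure above
n_features = 50     # dimensions of the facial-feature map described above

# Hypothetical linear tuning: each neuron's rate is a weighted sum of the
# face's feature coordinates, plus a little recording noise.
W = rng.normal(size=(n_features, n_neurons))

def record(face_features: np.ndarray) -> np.ndarray:
    """Simulate a noisy population firing-rate vector for one face."""
    return face_features @ W + 0.1 * rng.normal(size=n_neurons)

# "Training" set: faces with known feature values and their recorded rates.
train_faces = rng.normal(size=(300, n_features))
train_rates = np.stack([record(f) for f in train_faces])

# Fit a linear decoder that maps firing rates back to facial features.
decoder, *_ = np.linalg.lstsq(train_rates, train_faces, rcond=None)

# Decode a new, unseen face from its neural response alone.
new_face = rng.normal(size=n_features)
estimate = record(new_face) @ decoder
print(np.corrcoef(new_face, estimate)[0, 1])  # near-perfect correlation
```

In this toy setting the decoded feature vector correlates almost perfectly with the true one; in the real experiment, those decoded feature values are what get rendered back into the reconstructed face images shown in the figure.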