For the first time, adults are able to experience how much infants can visually perceive about the world. Thanks to researchers at the Institute of Psychology in collaboration with colleagues at the University of Uppsala and Eclipse Optics in Stockholm, Sweden, the combination of modern technology and a 15-year-old idea has yielded some important results about the vision of newborns.

Svein Magnussen, from the Institute of Psychology, had conducted studies on human visual perception more than a decade earlier. He and his colleagues found themselves discussing whether newborn infants could perceive facial expressions, the key idea being that the faces would be shown in motion rather than as still images.

“Previously, when researchers have tried to estimate exactly what a newborn baby sees, they have invariably used still photos,” Magnussen said in a press release. “But the real world is dynamic. Our idea was to use images in motion.”

There was no way to test the idea at the time, since the researchers lacked both the equipment and the technical expertise, but they recently returned to it and conducted fresh research.

The new study aimed to fill a gap in our knowledge of infant vision. Previous research, conducted mostly in the 1980s, had revealed a great deal about infants’ spatial resolution and contrast sensitivity. By showing babies patterns of black-and-white stripes of varying width and contrast against a plain gray background, researchers were able to determine how much contrast and how fine a level of detail were needed to capture the infants’ attention.
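Those classic stimuli can be described by two numbers: how fine the stripes are (spatial resolution) and how strongly they differ from the background (Michelson contrast). As a rough illustration rather than a reconstruction of the original experiments, the Python sketch below builds such a stripe pattern from those two parameters; the image size and the specific values are arbitrary.

```python
import numpy as np

def square_wave_grating(size_px=256, cycles_across=8, michelson_contrast=0.3,
                        mean_luminance=0.5):
    """Build a vertical black-and-white stripe pattern on a gray background.

    michelson_contrast = (Lmax - Lmin) / (Lmax + Lmin); stripe width is set by
    how many full cycles fit across the image.
    """
    x = np.linspace(0, cycles_across, size_px, endpoint=False)
    stripes = np.sign(np.sin(2 * np.pi * x))           # alternating light/dark bars
    amplitude = michelson_contrast * mean_luminance    # follows from the contrast formula
    row = mean_luminance + amplitude * stripes         # luminance values for one row
    return np.tile(row, (size_px, 1))                  # repeat rows to form the image

# Coarse, high-contrast stripes are easy to detect; fine, faint ones are not.
easy = square_wave_grating(cycles_across=4, michelson_contrast=0.8)
hard = square_wave_grating(cycles_across=64, michelson_contrast=0.05)
```

The `easy` and `hard` examples only illustrate the two dimensions the original threshold measurements varied.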

What was missing, though, was any practical application. These studies said little about whether newborns could recognize the facial expression of someone in front of them. Magnussen and his team used the earlier measurements to create animations of human faces displaying several emotions, then matched the contrast and spatial resolution of the animations to what a newborn would see, effectively creating a baby-vision filter.
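The article does not spell out the exact image-processing pipeline, so the sketch below is only a plausible stand-in: it removes fine detail with a Gaussian blur and compresses luminance toward mean gray, the two adjustments the description implies. The blur width and contrast factor are placeholder values, not the parameters used in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def baby_vision_filter(image, blur_sigma_px=8, contrast_scale=0.25):
    """Crude 'baby-vision' simulation: blur away fine detail (limited spatial
    resolution) and pull luminance toward mean gray (reduced contrast
    sensitivity). `image` is a 2-D array of values in [0, 1]; both parameters
    are illustrative assumptions.
    """
    blurred = gaussian_filter(image, sigma=blur_sigma_px)      # low-pass: drop fine detail
    mean_gray = blurred.mean()
    return mean_gray + contrast_scale * (blurred - mean_gray)  # compress contrast

# Example: filter a synthetic test image (random noise stands in for a face frame).
frame = np.random.rand(256, 256)
filtered = baby_vision_filter(frame)
```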

Adult participants were then shown the animations at different distances, and their accuracy at identifying the emotions was recorded. The researchers’ reasoning was that if an adult could not recognize an emotion in the filtered animation at a given distance, a newborn almost certainly would not be able to either.

The adults correctly identified three out of four faces at a distance of 30 centimeters, but their accuracy fell to roughly chance level by 120 centimeters. The researchers concluded that newborns can tell the difference between emotional expressions, but only at short range, about the distance between a mother’s face and her baby’s while she nurses.
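One way to see why distance matters so much: the detail available to the eye depends on the visual angle a facial feature subtends, and that angle shrinks in proportion to viewing distance. The feature size in the snippet below is purely illustrative, but the geometry is standard.

```python
import math

def visual_angle_deg(feature_size_cm, distance_cm):
    """Angle subtended at the eye by a feature of the given size."""
    return math.degrees(2 * math.atan((feature_size_cm / 2) / distance_cm))

# An illustrative 3 cm facial feature (roughly the mouth region):
print(visual_angle_deg(3, 30))    # ~5.7 degrees at nursing distance
print(visual_angle_deg(3, 120))   # ~1.4 degrees at 120 centimeters
```

Quadrupling the distance means the same facial detail must be resolved at a quarter of the angular size, which pushes it beyond the coarse resolution a newborn’s visual system can handle.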

“It’s important to remember that we have only investigated what the newborn infant can actually see, not whether they are able to make sense of it,” Magnussen said in the press release.

This is an important step forward for the field, but it also raises new questions. As for further research, Magnussen says that will be left to others.

“All of us behind this study are really involved in different fields of research now,” Magnussen said in the press release. “Our position is: Now a piece of the foundation is in place. If anyone else wants to follow up, that’s up to them.”

Source: Magnussen S, et al. Simulating newborn face perception. Journal of Vision. 2015.