Yale researchers discover gaps between in-person and online facial processing in the brain
A new study at the School of Medicine’s Brain Function Laboratory found that behavioral indicators and neural activity change when the brain processes faces over Zoom rather than in person.
Katherine Dick ’27 said that during 2020, “Zoom-school” negatively impacted not only her learning but also the connections between her and her peers.
In the rare moments when students and teachers turned their cameras on, she said, it proved difficult to recognize subtle changes in facial expressions that she would normally never miss in person.
“I remember not feeling connected to my classmates or teachers in the same way I used to be,” Dick reflected.
A new study from the Yale School of Medicine might lend Dick’s sentiment scientific backing: our brains do not process in-person and online faces in the same way.
Published last month in the scientific journal Imaging Neuroscience, the study found that neural activity in the areas of the brain involved in facial processing significantly decreased during face-to-face interactions on Zoom and similar platforms compared to in-person interactions.
Contrary to accepted models of how facial processing operates in the brain, referenced in the study, which predict no differences between Zoom and in-person interactions, researchers at the School of Medicine’s Brain Function Laboratory demonstrated that our brains interpret these two kinds of stimuli in distinct ways.
According to the researchers, the importance of human connection might be what makes the difference.
“A single brain is only one half of the most fundamental [social] unit,” said Joy Hirsch, professor of psychiatry, professor of comparative medicine and neuroscience and senior author of the study. “I think that our brains are designed to connect with other brains.”
The researchers’ interest in the brain’s dyadic activity, or activity arising from interactions between two people, in both in-person and online settings was not a result of the COVID-19 pandemic.
Instead, Hirsch said, she and other members of her lab — including Nan Zhao, a PhD student from China and the paper’s first author — began their work on the potentially groundbreaking research question before the pandemic even started.
In 2020, as the looming COVID-19 pandemic threatened a lab shutdown and Zhao’s student visa neared its expiration date, Hirsch said, the researchers banded together to finish all the necessary experimental trials as quickly as possible.
“We realized how important this question was before the pandemic,” Hirsch said. “And the pandemic actually slowed us down a bit, but helped us on the relevance frame. The study was more relevant after the pandemic, but that was not anticipated.”
Hirsch said that the study was made possible by a neuroimaging approach developed in her lab using functional near-infrared spectroscopy, or fNIRS, which allows two brains to be imaged at the same time.
Unlike most other brain imaging done in neuroscience research, which captures only one brain at a time while a participant completes specific tasks, Hirsch explained, fNIRS can image two interacting brains simultaneously and compare each brain’s activity to the other’s.
The new technology allows scientists to examine how the brain behaves during interactions with other people.
“Most of our behaviors in the everyday world are behaviors that relate to other people. We talk to each other, we share things, we seek each other out,” Hirsch said. “So the brain is mostly designed for interactions with other people, but we know very little about that because we haven’t been able to study it.”
Using this technology, Hirsch and other members of the lab have been able to conduct multiple studies to gain new perspectives on facial and language processing — perspectives that allow scientists to begin to look at the inner workings of the brain in a new way.
Eye contact and attention
Hirsch said it is widely accepted in the field of neuroscience that most facial processing occurs by the brain analyzing different facial features, with small patches of the brain’s cortex each dedicated to features like the hair, eyes, nose and mouth. The strategy, according to 2019 research from Emory University, allows the brain to identify faces in a wide variety of contexts.
But under this framework, Hirsch said, there should not have been any difference between the neural activity observed during face-to-face interactions in person or on Zoom. Hirsch and her colleagues speculated that the opposite might be the case.
“We set out to test the hypothesis, knowing full well that it was wrong but not knowing how, and assuming that the answer to how would be extremely informative in the next stages,” she said.
The researchers found that behavioral eye-tracking measures such as pupil dilation and the length of time the eyes lingered on a face increased when two people were interacting in person. The dorsal stream, a visual processing pathway associated with attention and eye contact, was also more active during real-life encounters than during Zoom encounters.
This finding is consistent with the idea that the in-person stimulus environment for facial processing is richer than the one presented to the brain on Zoom, Hirsch said. The differences could come down to a variety of factors, including eye contact and reactions to facial expressions.
“One of the things that one can assume [is that] the eye-to-eye contact conditions, which are very important in personal interactions, are altered, because here we’ve got a slanted camera,” Hirsch said. “So that reciprocity that would go on between us normally in this very fine-to-micro-movement scenario in real life probably doesn’t go on in the online encounter.”
Amy Arnsten, the Albert E. Kent Professor of Neuroscience and professor of psychology, agreed with the hypothesis put forward by Hirsch.
In an email to the News, Arnsten noted the importance of eye movements as critical social signals in humans and other primates. Missing any one of them, she said, could contribute to differences in how we process each other’s faces online.
“It makes sense that there is much less of this on Zoom, as we are not really looking at each other, and people rarely even look directly into the camera,” Arnsten wrote. “Thus, we miss all the small but powerful signals about how a person is responding to us. It makes sense that we would not feel as connected under these conditions.”
Trouble publishing
According to Hirsch, the study has important implications — not just for experts in the field of cognitive neuroscience but also for society as a whole.
The concept that the brain processes faces using additional systems that scientists do not currently understand, Hirsch said, is new.
As a result, Hirsch said, their proposal was not initially well received. When she and her colleagues first tried to publish the study, she said, they faced significant barriers, which she attributed to a hesitancy within the field to substantially revise an entire school of thought.
“Believe it or not, we had a lot of trouble publishing it,” Hirsch said. “And the reason is that the basic neuroscience community, I think, just wasn’t ready to realize that there’s this whole new domain of neural processes that go on.”
When the paper was published, it received what seemed to Hirsch to be immediate and widespread attention, not just from experts in the field but from readers across the world.
Hirsch said that the popular response to the research might be due to how universal the experience of social isolation was during the COVID-19 pandemic, and how integrated Zoom-like platforms are in settings ranging from schools to legal proceedings and telehealth medical appointments.
She recommended that the scientific community take action to address the knowledge gap and advance neuroscience’s understanding of interpersonal interaction as a whole.
“Our education, our medicine, our learning, our business transactions, you know, our law, whatever it is – all those things need to be investigated,” she said.
The Yale Department of Psychiatry is located at 300 George St.