Medicine 2026-03-19

Consonant chords light up the brain's social circuits during eye contact

Yale researchers used portable brain imaging to show that familiar harmonic progressions amplify neural activity tied to interpersonal bonding

AZA Allsop is a jazz keyboardist, a vocalist, and an assistant professor of psychiatry at Yale School of Medicine. Five years ago, he came across work by his colleague Joy Hirsch showing that group drumming changes social behavior and brain activity simultaneously. He cold-called her. The collaboration that followed has now produced a study that maps, in real time, how specific musical structures reshape the neural machinery of human connection.

A keyboardist and a ballroom dancer walk into a lab

Hirsch, the Elizabeth Mears and House Jameson Professor of Psychiatry at Yale, is a neuroscientist and a nationally competitive ballroom dancer. Between Allsop's jazz training and Hirsch's kinesthetic relationship to rhythm, the pair brought an unusual degree of embodied musical knowledge to a brain-imaging project. That mattered. Designing the right auditory stimulus — choosing which chords, in what order, at what tempo — required someone who understood music from the inside.

Their hypothesis was specific: certain chord progressions show up so frequently in Western pop, jazz, and folk music not merely because of cultural convention, but because they do something measurable to human physiology. The progressions feel right. But why?

Face to face, eye to eye, chord by chord

The experimental setup eliminated the usual isolation of brain-imaging studies. Instead of sliding subjects into an MRI tube alone, the researchers seated pairs of participants across a table from one another. Each pair was instructed to maintain direct eye contact — face to face, no screens, no intermediary.

During some trials, the pair heard consonant chord progressions: harmonically predictable, pleasant sequences common across Western musical traditions. During others, they heard scrambled versions of the same notes — identical pitches rearranged into sequences that lacked harmonic logic. Control trials used silence.
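The scrambled control can be sketched in a few lines: reuse exactly the pitches of the consonant progression, but shuffle them across chord slots so the harmonic logic disappears while the pitch content stays identical. This is an illustrative reconstruction, not the authors' stimulus code, and the specific chords (a I–V–vi–IV in C major) are an assumption:

```python
import random

# Hypothetical consonant progression (I-V-vi-IV in C major) as MIDI note lists.
# These chords are illustrative; the study's actual stimuli may differ.
consonant = [
    [60, 64, 67],  # C major
    [67, 71, 74],  # G major
    [69, 72, 76],  # A minor
    [65, 69, 72],  # F major
]

def scramble(progression, seed=0):
    """Shuffle the identical pitches across chord slots, destroying the
    harmonic relationships while preserving the overall pitch content."""
    rng = random.Random(seed)
    pool = [note for chord in progression for note in chord]
    rng.shuffle(pool)
    # Rebuild chords of the original sizes from the shuffled pool.
    out, i = [], 0
    for chord in progression:
        out.append(pool[i:i + len(chord)])
        i += len(chord)
    return out

scrambled = scramble(consonant)
# Same multiset of pitches, different (non-harmonic) groupings.
assert sorted(sum(scrambled, [])) == sorted(sum(consonant, []))
```

Because the pitch content is held constant, any difference in brain response between the two conditions can be attributed to harmonic structure rather than to the notes themselves.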

Brain activity was measured with functional near-infrared spectroscopy (fNIRS), a portable imaging technique that tracks blood-flow changes in the cortex. Unlike MRI, fNIRS lets subjects sit upright, move naturally, and — critically for this study — interact with another person in real time.
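Under the hood, fNIRS typically recovers hemoglobin concentration changes from optical-density changes via the modified Beer-Lambert law. A minimal sketch of that inversion follows; the extinction coefficients, separation, and path-length factor below are placeholder values for illustration, not calibrated constants from any real pipeline:

```python
import numpy as np

# Modified Beer-Lambert inversion: delta_OD = (eps * d * dpf) @ delta_c,
# solved for delta_c = [HbO, HbR] concentration changes.
# All numbers here are placeholders chosen only to reflect the qualitative
# pattern (HbR absorbs more near 760 nm, HbO more near 850 nm).
eps = np.array([[1.5, 3.8],    # [HbO, HbR] extinction at ~760 nm (illustrative)
                [2.5, 1.8]])   # [HbO, HbR] extinction at ~850 nm (illustrative)
d = 3.0     # source-detector separation, cm (typical order of magnitude)
dpf = 6.0   # differential path-length factor (illustrative)

def mbll(delta_od):
    """Solve the 2x2 linear system for hemoglobin concentration changes."""
    return np.linalg.solve(eps * d * dpf, delta_od)

delta_c = mbll(np.array([0.01, 0.02]))  # optical-density changes at 2 wavelengths
```

Two wavelengths give two equations, enough to separate oxy- from deoxy-hemoglobin; the resulting concentration time courses are what studies like this one analyze.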

The social brain responds to harmony

When consonant chord progressions played, activity spiked in brain regions associated with social perception, emotional processing, and interpersonal connection. The scrambled-note condition did not produce the same effect. Neither did silence. Something about the harmonic structure — not just the presence of sound — was driving the neural response.

Participants also reported subjectively feeling more connected to their partner during the consonant-chord trials. And here's what caught even the researchers off guard: the strength of that subjective feeling correlated directly with the magnitude of activation in those specific brain regions. People who reported the strongest sense of connection showed the most activity in the social-perception areas. The link between neural signal and felt experience was unusually tight.

This wasn't a case of music making people generally aroused or attentive. The effect was selective. It targeted the brain's social circuitry — the regions that help us read faces, interpret intentions, and feel bonded to others.

Why the common progression matters

Allsop chose the chord progression deliberately. It's a sequence familiar to anyone who has listened to pop radio, a jazz standard, or a hymn. The question was whether its ubiquity reflects something deeper than convention — whether the progression persists because it primes us for social engagement in a way that arbitrary note combinations do not.

The data suggest it does. Consonant progressions appear to prepare the brain for social interaction, enhancing the neural systems we use to understand and respond to other people. That would help explain why music shows up so consistently at the moments when humans come together: weddings, funerals, worship, protest marches, sporting events. It may not just accompany bonding. It may actively enable it.

From ritual to therapy

The implications extend beyond explaining why concerts feel connective. Music-based interventions already exist for conditions marked by social disconnection — autism spectrum disorder, social anxiety, depression with isolation. But these therapies have largely relied on clinical intuition about which musical elements help and why.

This study offers a potential mechanism. If consonant harmonic progressions specifically amplify the neural systems that support social perception, clinicians could design music interventions with greater precision — selecting stimuli that target the right circuits rather than relying on general musical exposure.

That's still speculative. The study used neurotypical adult participants in a controlled lab setting. Whether the same neural amplification occurs in people with autism, social anxiety, or other conditions that impair social processing remains untested. The sample was also drawn from a Western cultural context saturated with the chord progressions used in the experiment. Listeners raised on different harmonic traditions might respond differently.

What a chord progression can and cannot do

It's worth being precise about the boundaries of this finding. The study shows that a specific type of musical stimulus enhances activity in social brain regions during face-to-face interaction. It does not show that music creates social bonds where none would otherwise form, nor does it demonstrate lasting changes in social behavior or relationship quality. The measurements captured a moment — a neural state during listening — not a permanent shift.

The fNIRS technique, while ideal for naturalistic social interaction, measures only cortical surface activity. Deeper brain structures involved in emotional processing — the amygdala, the ventral striatum — remain invisible to it. The full neural story is likely more complex than what surface imaging can reveal.

Still, the core finding is striking: among all the sounds the brain might respond to during social interaction, structured harmonic progressions activate the specific regions most involved in perceiving and connecting with others. Music doesn't just set a mood. At the level of neural circuits, it tunes the brain for togetherness.

Source: Published in The Journal of Neuroscience (March 2026). Research by AZA Allsop (first author), Dash Watts (co-first author), Adam Noah, Xian Zhang, Simone Compton, and Joy Hirsch (senior author), Yale School of Medicine. DOI: 10.1523/JNEUROSCI.1116-25.2026