Male Fruit Flies Use a Female's Eyes - Not Just Her Movement - to Guide Courtship Behavior
Male vinegar flies, Drosophila melanogaster, perform an elaborate courtship sequence when pursuing females: they orient toward her, tap her with their legs, chase her, and produce a species-specific song by vibrating their wings. These behaviors have been studied for decades as a window into how brains translate sensory information - pheromones, sound, vision - into complex social behavior. Vision's role in this sequence has generally been understood as simple: spot the female, track her movement, follow her.
A study published in G3: Genes, Genomes, Genetics from Yehuda Ben-Shahar's lab at Washington University in St. Louis challenges that picture. Using a computer vision and machine learning approach, the research shows that male flies use surprisingly specific visual cues - particularly the female's eyes - to determine her body axis, and that this spatial recognition shapes which courtship behaviors are directed toward which part of her body.
The Experiment
Because males spend most of their courtship time chasing moving females, it is extremely difficult to track exactly where different behaviors are aimed relative to the female's body. Ben-Shahar, together with neuroscience graduate students Ross McKinney and Christian Monroy Hernandez, developed a simplified courtship paradigm that addressed this problem: they placed males with a stationary female, then used automated behavioral tracking to map male courtship behaviors relative to specific regions of the female's body.
This approach - pairing high-resolution video tracking with machine-learning classifiers trained to identify specific behaviors - allowed the team to generate precise spatial data that manual scoring cannot produce. "Using a trainable computer algorithm for the analyses provided more robust data by reducing errors due to human observation biases," Ben-Shahar said. The framework represents the first published use of this automated spatial analysis approach for Drosophila courtship behavior.
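The core of such a spatial analysis is a change of coordinates: each tracked male position is expressed relative to the female's own body axis, so behaviors can be binned as aimed at her anterior or posterior half. The sketch below illustrates that geometric step only; the function name, inputs, and threshold are illustrative assumptions, not the study's actual pipeline.

```python
# Hypothetical sketch: classify where a male fly is relative to a
# stationary female, given tracked (x, y) coordinates for her head,
# her tail, and the male. Illustrative only, not the published code.
import math

def body_axis_side(female_head, female_tail, male_pos):
    """Return 'anterior' if the male sits on the head side of the
    female's body midpoint, else 'posterior'."""
    # Midpoint of the female's head-tail body axis
    mid = ((female_head[0] + female_tail[0]) / 2,
           (female_head[1] + female_tail[1]) / 2)
    # Unit vector pointing from tail to head (the anterior direction)
    ax = female_head[0] - female_tail[0]
    ay = female_head[1] - female_tail[1]
    norm = math.hypot(ax, ay)
    ax, ay = ax / norm, ay / norm
    # Project the male's offset from the midpoint onto the body axis;
    # a positive projection means he is toward the head end
    proj = (male_pos[0] - mid[0]) * ax + (male_pos[1] - mid[1]) * ay
    return "anterior" if proj > 0 else "posterior"

# Example: female head at (1, 0), tail at (-1, 0); male at (0.5, 0.8)
print(body_axis_side((1, 0), (-1, 0), (0.5, 0.8)))  # anterior
```

In a full pipeline, a machine-learning classifier would first label each video frame with a behavior (song, tapping, chasing), and this coordinate transform would then assign each labeled event to a body region.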
What the Flies Actually Do
The data showed that males consistently bias certain behaviors toward either the anterior or posterior half of the female. Song production, for example, was preferentially directed toward the female's head end. This spatial precision depended on vision: when visual input was removed, males could no longer consistently target their behaviors to the correct body region.
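A spatial bias like this can be summarized as the fraction of a behavior's events aimed at the anterior half: values near 1.0 indicate head-directed behavior, values near 0.5 indicate no preference. The snippet below is a minimal illustration of that metric; the event counts are made up, not data from the study.

```python
# Hypothetical sketch: quantify the spatial bias of a courtship behavior
# from a list of per-event target labels ('anterior' or 'posterior').
def anterior_bias(event_targets):
    """Fraction of events directed at the female's anterior half.
    Returns None if there are no events to score."""
    if not event_targets:
        return None
    return sum(t == "anterior" for t in event_targets) / len(event_targets)

# Made-up example: 8 of 10 song bouts aimed at the head end
song_events = ["anterior"] * 8 + ["posterior"] * 2
print(anterior_bias(song_events))  # 0.8
```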
The visual feature driving this spatial recognition turned out to be the female's eyes. Because the eyes are located at the front of the female's head, they provide a reliable anatomical marker for orienting the male's courtship sequence. "At the center of this finding are the female's eyes, which mark the front of the female. This recognition increases the likelihood that certain courtship behaviors, for example the song, will be directed toward the female's head," Ben-Shahar said.
"Our interpretation is that males use the visual recognition of specific anatomical features of the female as triggers for releasing specific behavior at the right location and distance from the female," he said. Rather than being a simple on-off switch, courtship appears to be a continuously modulated behavioral program, shaped moment to moment by sensory cues.
Neural Architecture of Spatial Recognition
The study also investigated which neural circuits underlie this spatial recognition. By analyzing the contributions of different visual projection neurons, the researchers found that the female's body orientation is not computed through a single specialized neural pathway. Instead, it emerges from multiple independent neuron populations working in parallel. This distributed architecture may help explain why the spatial recognition is robust even under variable visual conditions.
The research has implications beyond fly courtship. Drosophila melanogaster is a central model organism in neuroscience and genetics, and tools that make behavioral observation more accurate and scalable can accelerate research across many questions about brain function and sensory systems. "We would like to expand it to studies of other behaviors that can be measured in two-dimensional spaces," Ben-Shahar said. Future development of a three-dimensional tracking system would allow analysis of more complex interactions.
The study was supported by a Howard A. Schneiderman Graduate Fellowship from WashU, the Genome Analysis Training Program at WashU, NIH grants NS089834 and ES025991, and NSF grants 1545778 and 1707221.