Medicine 2026-03-13 3 min read

Your brain predicts where things should be after every eye movement - and it is 94% accurate

Afterimages reveal that the brain's internal compass for visual stability consistently undershoots the true distance the eyes travel

The bright blob you see after staring at a lamp is not just an annoyance. It is a window into one of the most sophisticated computations your brain performs every waking second.

Your eyes jump several times per second in rapid movements called saccades. Each jump shifts the entire image on your retina. By rights, the world should appear to lurch every time you glance from your coffee cup to the window. It does not. The brain compensates, and it does so with remarkable precision - but not perfectly.

The afterimage trick

A team from the Cluster of Excellence Science of Intelligence in Berlin - Richard Schweitzer, Thomas Seel, Jörg Raisch, and Martin Rolfs - used afterimages to isolate the brain's internal signals for tracking eye movements. The study was published in Science Advances.

The experiment was elegantly simple. Participants sat in complete darkness. A bright flash created an afterimage on the retina. Then a second light appeared at a different location, and participants looked toward it. Once the afterimage became visible again, brief probe lights appeared at various positions, and participants reported whether the afterimage seemed to lie to the left of, to the right of, or directly in line with the probe.

Because an afterimage is physically fixed on the retina, it cannot move when the eyes move. Yet participants consistently perceived the afterimage as having shifted position - following their gaze. By measuring exactly how far the afterimage appeared to move relative to how far the eyes actually traveled, the researchers could quantify the accuracy of the brain's internal prediction.

94% accuracy, 6% mystery

The perceived shift of the afterimage reached about 94% of the actual eye movement. That is remarkably close, but the consistent 6% undershoot - called hypometria - was not random noise. It held across individuals, across different directions, and across different magnitudes of eye movement. This points to a systematic bias in the brain's prediction rather than sloppy computation.
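The measurement boils down to a ratio: perceived shift divided by actual eye displacement, pooled across saccade sizes. A toy calculation illustrates it - the 0.94 gain reflects the study's finding, but the individual trial values below are invented for illustration:

```python
# Toy illustration of estimating the perceptual gain across saccade sizes.
# Each pair is (actual saccade amplitude in deg, perceived afterimage shift
# in deg). The numbers are invented; only the ~0.94 ratio reflects the study.
trials = [
    (5.0, 4.70),
    (10.0, 9.40),
    (15.0, 14.10),
    (20.0, 18.80),
]

# Least-squares slope through the origin: sum(x*y) / sum(x*x).
num = sum(amp * shift for amp, shift in trials)
den = sum(amp * amp for amp, _ in trials)
gain = num / den

print(f"perceptual gain ~ {gain:.2f}")          # ~ 0.94
print(f"undershoot ~ {(1 - gain) * 100:.0f}%")  # ~ 6%
```

A slope fitted through the origin is the natural estimator here, because with no eye movement an afterimage should show no perceived shift at all.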

The source of that prediction appears to be the efference copy - an internal duplicate of the motor command sent to the eye muscles. Rather than waiting for new visual information after each saccade, the brain uses this copy to anticipate the consequences of its own movement. It effectively tells itself how far the eyes just moved and adjusts the perceived visual scene accordingly.
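The efference-copy account can be sketched in a few lines of code. This is a simplified model, not the paper's implementation; the function name and the additive form are assumptions, and only the 0.94 gain comes from the study:

```python
def perceived_position(retinal_pos, eye_displacement, gain=0.94):
    """Combine where a stimulus falls on the retina with the efference-copy
    estimate of how far the eyes just moved. An afterimage is fixed on the
    retina (retinal_pos never changes), so any perceived shift comes
    entirely from the gain * eye_displacement term. The 0.94 gain reflects
    the study's ~6% undershoot; the additive model itself is a sketch."""
    return retinal_pos + gain * eye_displacement

# Afterimage at retinal position 0; the eyes make a 10-degree saccade.
before = perceived_position(0.0, 0.0)
after = perceived_position(0.0, 10.0)
print(f"perceived shift: {after - before:.1f} deg")  # 9.4 deg, not the full 10
```

The point of the sketch is the asymmetry: the retinal input contributes nothing to the shift, so whatever motion the participant perceives is a direct readout of the internal displacement estimate.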

The researchers tested whether post-movement visual feedback altered where participants perceived the afterimage. In some trials, the target light remained visible after the eye landed; in others, it was deliberately shifted to create misleading feedback. Neither manipulation changed the perceived afterimage position. The brain was committing to its prediction before checking the evidence.

When eye movements change, predictions adapt

The team pushed further by exploiting a phenomenon called saccadic adaptation. When the eyes consistently miss their targets - due to fatigue or experimental manipulation - people gradually adjust how far they move their eyes. The researchers found that as participants' saccades became shorter through adaptation, the perceived shift of the afterimage shortened proportionally.

But the 6% undershoot remained, whether saccades were adapted or not. The bias appears baked into the system.
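Both findings follow naturally if the internal prediction simply applies a fixed gain to whatever motor command is issued. The loop below is a toy simulation under that assumption - the adaptation rate, the induced 2-degree target step, and the update rule are all invented for illustration; only the 0.94 gain comes from the study:

```python
GAIN = 0.94  # perceptual gain from the study; everything else is illustrative

def simulate_adaptation(target=10.0, rate=0.3, steps=6):
    """Toy saccadic-adaptation loop: the experiment shifts the target so
    every saccade appears to overshoot by 2 deg, and the motor command
    shrinks by a fraction of that error on each trial. As the command
    adapts, the predicted (perceived) shift scales with it, so the
    ratio stays pinned at GAIN throughout."""
    command = target
    for _ in range(steps):
        apparent_error = 2.0              # induced backward target step
        command -= rate * apparent_error  # motor learning shortens the saccade
        perceived_shift = GAIN * command  # prediction tracks the command
        print(f"saccade {command:5.2f} deg -> perceived {perceived_shift:5.2f} deg "
              f"(ratio {perceived_shift / command:.2f})")
    return command

simulate_adaptation()
```

Running this, the saccade shrinks from 9.4 to 6.4 degrees while the printed ratio never moves from 0.94 - the same pattern the researchers observed: proportional shortening of the perceived shift, with the undershoot left intact.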

Why the error might be a feature

Natural saccades tend to fall slightly short of their targets. If the brain has learned, through a lifetime of visual experience, that saccades typically undershoot, it would make sense for the internal model to expect a slightly smaller visual shift than the eyes actually produce. In a world where objects stay put between glances, a slightly conservative prediction would be correctable by incoming visual information. A prediction that consistently overshot might cause more perceptual disruption.

In other words, the 6% error may not be a bug. It may be the brain's way of playing it safe in a visual environment that offers constant feedback for correction.

Limits and applications

The experiments were conducted in complete darkness - the opposite of normal seeing, where a rich visual scene provides constant error-correction signals. Whether the same 94% figure holds in well-lit, cluttered environments is an open question. The sample sizes were modest, and the study examined only horizontal and vertical saccades, not the oblique movements common in natural viewing.

Still, the findings have implications beyond basic neuroscience. Virtual reality systems need accurate models of how the brain tracks eye movements to reduce motion sickness. Robotics systems that pair cameras with motor commands face analogous prediction problems. And clinicians studying eye-movement disorders - from nystagmus to cerebellar disease - may find that afterimage-based testing offers a precise, noninvasive way to assess the brain's internal model of its own movements.

Source: Schweitzer, Seel, Raisch, and Rolfs, "High-fidelity but hypometric spatial localization of afterimages across saccades," Science Advances, 2026. Cluster of Excellence Science of Intelligence, Berlin.