Contrary to long-standing beliefs, motion from eye movements helps the brain perceive depth—a finding that could enhance virtual reality [rochester.edu]:
When you go for a walk, how does your brain know the difference between a parked car and a moving car? This seemingly simple distinction is challenging because eye movements, such as the ones we make when watching a car pass by, make even stationary objects move across the retina—motion that has long been thought of as visual "noise" the brain must subtract out.
Now, researchers at the University of Rochester have discovered that instead of being meaningless interference, the visual motion of an image caused by eye movements helps us understand the world. The specific patterns of visual motion created by eye movements are useful to the brain for figuring out how objects move and where they are located in 3D space.
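The core geometry can be illustrated with a toy calculation. Below is a minimal sketch (my own construction, not the paper's model): an observer translates sideways while counter-rotating the eye to keep a fixation point centered, using the standard pinhole-camera optic-flow terms. All values and the function name are illustrative assumptions. The fixated point stays still on the retina, while nearer and farther stationary points slide in opposite directions; that depth-dependent pattern is only interpretable if the brain can work out the eye rotation, as the study argues it does from the global flow.

```python
# Toy sketch: retinal (image) motion of STATIONARY points while an observer
# translates sideways and rotates the eye to keep a fixation point centered.
# Illustrative units only; not the authors' model.

f = 1.0        # focal length (arbitrary units)
T = 0.1        # sideways translation speed of the observer
Z_fix = 2.0    # depth of the fixated point

# Eye rotation rate that keeps the fixation point on the fovea (x = 0):
# translation alone would move its image at -f*T/Z_fix, so the eye
# counter-rotates to cancel exactly that.
omega = -T / Z_fix

def image_velocity(Z, x=0.0):
    """Horizontal image velocity of a stationary point at depth Z.

    Translation contributes a depth-DEPENDENT term (-f*T/Z); the eye
    rotation adds a depth-INDEPENDENT term (-omega*(f + x**2/f)).
    """
    return -f * T / Z - omega * (f + x**2 / f)

for Z in [1.0, 2.0, 4.0]:
    print(f"depth {Z:3.1f}: image velocity {image_velocity(Z):+.4f}")
```

Running this, the fixated point (depth 2.0) has zero image velocity, while points nearer than fixation drift one way and farther points the other. Subtracting off the rotation term, as the "nuisance" view would have it, discards exactly the comparison that reveals depth; using the full pattern, as the study suggests the brain does, recovers it.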
"The conventional idea has been that the brain needs to somehow discount, or subtract off, the image motion that is produced by eye movements, as this motion has been thought to be a nuisance," says Greg DeAngelis, [...] "But we found that the visual motion produced by our eye movements is not just a nuisance variable to be subtracted off; rather, our brains analyze these global patterns of image motion and use this to infer how our eyes have moved relative to the world."
[...] "We show that the brain considers many pieces of information to understand the 3D structure of the world through vision, including the patterns of image motion caused by eye movements," says DeAngelis. "Contrary to conventional ideas, the brain doesn't ignore or suppress image motion produced by eye movement. Instead, it uses this image motion to understand a scene and accurately estimate an object's motion and depth."
This research has important implications for understanding visual perception, which underlies everyday activities like reading and recognizing faces. It could also inform new applications in visual technologies, such as virtual reality headsets.
"VR headsets don't factor in how the eyes are moving relative to the scene when they compute the images to show to each eye. There may be a stark mismatch between the image motion that is shown to the observer in VR and what the brain is expecting to receive based on the eye movements that the observer is making," says DeAngelis. This could be what causes some people to experience motion sickness while using a VR headset.
Journal Reference: Xu, ZX., Pang, J., Anzai, A. et al. Flexible computation of object motion and depth based on viewing geometry inferred from optic flow. Nat Commun 17, 1092 (2026). https://doi.org/10.1038/s41467-025-67857-4 [doi.org]