From insects to fish to mammals: Active vision in non-primate organisms

Symposium: Friday, May 16, 2025, 1:00 – 3:00 pm, Talk Room 1

Organizers: Lisa Kroell1, Lisa Fenk1; 1Max Planck Institute for Biological Intelligence
Presenters: Lisa Fenk, Basil el Jundi, Lisa Bauer, Eva Naumann, Jason Kerr, Philip Parker

Certain questions have occupied active vision researchers for decades: How is visual perception modulated yet ultimately undisturbed by frequent movements of the eye, head and body? How is external world motion distinguished from motion signals caused by effector movements? And what functions may eye movements serve beyond the foveation of relevant information? Within the field of human and non-human primate vision, these questions are extensively discussed by the VSS community. Among the remaining 99.98% of living animal species, however, a universe of active visual behaviors waits to be discovered. While strikingly similar questions arise across species, many non-primate organisms face a markedly different set of preconditions. They might, for instance, possess uniform acuity across a broad visual field, need to account for the visual consequences of both gait and flight, or even be able to move their eyes independently of one another. Simultaneously, the study of non-primate organisms opens up a world of methodological possibilities: Depending on the species, neuronal activity can be controlled using optogenetics, neurophysiological recordings can be performed during unconstrained behavior and, due to the reduced complexity of the underlying neural systems, visual circuits can be dissected in remarkable detail.

We gathered six active vision specialists from five international institutions who harness this exceptional toolset to investigate visual perception in behaving insects, fish and non-primate mammals. Our speakers—all of them first-time VSS attendees—represent a variety of career stages, from graduate student to early independent researcher to full professor. Despite their use of diverse model systems, they pursue a common aim: understanding how movements of the eyes, paws, wings, tails or fins shape and even support vision.

First, Lisa Fenk will introduce the audience to the recently discovered retinal movements in fruit flies. Through video-based eye tracking and whole-cell patch-clamp recording, she investigates the properties of these movements, as well as the consequences of spontaneous movements for visual processing in general. Next, Basil el Jundi will demonstrate that, similar to vertebrates, spatial navigation in dung beetles and Monarch butterflies is accomplished by neurons coding for head and goal direction. Lisa Bauer will subsequently venture into underwater habitats and describe how spontaneous and stimulus-induced saccadic eye movements in zebrafish alter the activity of visual neurons. Eva Naumann will follow up by showcasing the vast potential of combining several sophisticated optical techniques: Through two-photon microscopy and 3D holographic optogenetic photostimulation, she pinpoints how visual signals are translated to motor commands during the zebrafish optomotor response. Next, Jason Kerr will demonstrate that freely pursuing ferrets execute stabilizing eye movements as well as saccades that align optic flow fields with the high-resolution area centralis and with the intended direction of travel. Philip Parker will conclude by showing that neurons in primary visual cortex of unconstrained mice demonstrate a saccade-locked coarse-to-fine processing sequence akin to what is observed in primates. By combining this diverse yet highly complementary expertise, we hope to open the conference up to novel audiences, stimulate future comparative work and, potentially, uncover unifying principles of active vision across species.

Talk 1

Neural mechanisms for active eye movements in Drosophila

Lisa Fenk1; 1Max Planck Institute for Biological Intelligence

Our work focuses on understanding two fundamental aspects of active vision. How do brains ignore aspects of the changing sensory stream that are not informative for the task at hand? And, perhaps more remarkably, how do brains actively move their sensors to create sensory patterns of activity that enhance their perception of the world? We use the Drosophila visual system to study both of these sensory challenges in a genetic model organism. During fast flight turns, we observe motor-related inputs to Drosophila visual cells whose properties suggest that they briefly abrogate the cells’ visual-sensory responses. Rather than a wholesale shutdown of the visual system during flight turns, fly visual neurons receive targeted inputs that are precisely calibrated to cancel each cell’s expected visual response, suggesting that they function as “efference copies.” While flies suppress the perception of self-generated visual motion during flight turns, they also purposefully generate visual motion in other circumstances. We recently discovered that fruit flies move their retinas via tiny muscles, both seemingly spontaneously and in response to visual motion. These movements share surprising similarities with vertebrate eye movements. We now leverage fly retinal movements as a relatively simple model to examine the cellular underpinnings of active visual processing. We aim to understand how fly eye movements are controlled neuronally, how the brain processes input from moving eyes, and how visual perception ultimately benefits from eye movements.
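As a rough illustration of the efference-copy idea described above (a minimal sketch with invented names and numbers, not the lab's actual model), a motor-derived prediction of a turn's visual consequences can simply be subtracted from a cell's measured response, leaving the externally generated component:

```python
import numpy as np

# Illustrative sketch of efference-copy cancellation (hypothetical parameters).
# During a self-generated turn, a visual neuron's expected sensory drive is
# predicted from the motor command and subtracted from the measured response,
# so the residual reflects only externally caused motion.

def expected_visual_response(turn_velocity, gain=1.0):
    """Predicted sensory drive caused by the animal's own turn (assumed linear)."""
    return gain * turn_velocity

def residual_response(measured_response, turn_velocity, gain=1.0):
    """Response remaining after the efference copy is subtracted."""
    return measured_response - expected_visual_response(turn_velocity, gain)

# Example: a turn produces 5 units of self-motion signal plus 2 units of
# external motion; cancellation recovers the external component.
turn_velocity = 5.0
measured = expected_visual_response(turn_velocity) + 2.0
print(residual_response(measured, turn_velocity))  # -> 2.0
```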

Talk 2

Linking brain and behavior in insect orientation

Basil el Jundi1; 1Norwegian University of Science and Technology

Insects are clearly among the most capable navigators on Earth. Their navigation abilities range from the simple orientation behaviors of dung beetles that roll dung balls along straight-line paths to the more sophisticated navigational skills of Monarch butterflies that migrate over large distances between their breeding and non-breeding habitats. But what kind of visual and non-visual cues do these insects use to effectively exhibit different kinds of navigation strategies? How does their brain, which is even smaller than a grain of rice, use these cues to control an insect’s steering direction, and how similar are the neural principles found in insects to those of vertebrates? In all insects, a highly conserved brain region, termed the central complex, acts as the internal compass for spatial orientation and navigation. To understand how the central complex encodes orientation based on multimodal sensory information, my research group investigates the behavioral and neural principles of the Monarch butterfly and dung beetle navigation systems. Our recent results suggest that their brains are equipped with head-direction and goal-direction neurons, similar to the ones described in the vertebrate brain. Thus, insects offer a unique window into the core behavioral and neuronal principles of animal navigation and how they are applied under ecologically relevant settings.
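One simple way head-direction and goal-direction signals could jointly drive steering, sketched here purely for illustration (the central-complex circuitry itself is far richer than this), is a turn command proportional to the circular difference between goal and heading; all parameters below are assumptions:

```python
import numpy as np

# Illustrative sketch (not the authors' model): a steering command derived from
# the mismatch between a goal-direction signal and a head-direction signal,
# both expressed in radians.

def steering_command(head_direction, goal_direction, gain=1.0):
    """Turn command proportional to the signed circular error (goal - heading)."""
    error = np.angle(np.exp(1j * (goal_direction - head_direction)))  # wrap to [-pi, pi]
    return gain * error

# Example: heading 10 deg, goal 60 deg -> a positive turn command of ~50 deg.
print(np.degrees(steering_command(np.radians(10), np.radians(60))))
```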

Talk 3

Understanding visual processing during saccades using zebrafish

Lisa Bauer1, J. C. Donovan1, H. Baier1; 1Max Planck Institute for Biological Intelligence

Saccadic eye movements are fundamental to vertebrate visual perception, yet it remains unclear exactly how visual circuits smoothly process the resulting shifts in visual input. The zebrafish model, with its experimental tractability, offers an ideal system for investigating the underlying sensorimotor circuits. Zebrafish perform spontaneous and visually induced saccades as early as 4 days post fertilization (dpf). Here, we combined eye tracking and two-photon calcium imaging to investigate the neuronal correlates of saccades in larval zebrafish (6-8 dpf). We focus on the optic tectum (OT), the fish equivalent of the mammalian superior colliculus (SC) and the largest visual area of the zebrafish brain. Using a two-photon microscope custom-modified with a remote focusing path to enable rapid multi-plane imaging, we record single-cell-resolution neuronal activity across the OT at 5 volumes per second. Even in the absence of visual stimuli, we find neurons in the OT that show increased activity correlated with spontaneous saccades. To investigate how visual stimuli are integrated, we recorded from the same neurons during various visual stimulus paradigms. Our findings reveal that most spontaneous-saccade-correlated neurons in the OT respond similarly regardless of the visual environment. Moreover, neurons responding to spontaneous saccades form a subset of those responding to visually induced saccades. Notably, while many neurons were active around the time of a saccade, the activity of certain neurons peaked before saccade onset, suggesting a role in anticipatory motor planning. Our results underscore the effectiveness of the larval zebrafish as a model for functional investigation, enabling experimental approaches that are challenging to implement in primate models.
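For readers unfamiliar with this type of analysis, a minimal sketch of how saccade-correlated neurons might be identified from such recordings (simulated data and hypothetical array shapes, not the actual analysis pipeline) is a saccade-triggered average of each neuron's calcium trace:

```python
import numpy as np

# Illustrative sketch (hypothetical data shapes): find neurons whose calcium
# activity is elevated around spontaneous saccades via saccade-triggered averaging.

def saccade_triggered_average(traces, saccade_frames, window=10):
    """traces: (n_neurons, n_frames) dF/F; saccade_frames: frame indices of saccade onset.
    Returns an (n_neurons, 2*window+1) average aligned to saccade onset."""
    n_neurons, n_frames = traces.shape
    snippets = []
    for f in saccade_frames:
        if f - window >= 0 and f + window < n_frames:
            snippets.append(traces[:, f - window:f + window + 1])
    return np.mean(snippets, axis=0)

# Example with simulated data: 50 neurons, 3000 frames (e.g., at 5 volumes/s).
rng = np.random.default_rng(0)
traces = rng.normal(size=(50, 3000))
saccades = rng.integers(20, 2980, size=40)
sta = saccade_triggered_average(traces, saccades)
print(sta.shape)  # (50, 21)
```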

Talk 4

Functional connectivity constrained simulations of visuomotor circuits in zebrafish

Eva Naumann1; 1Duke University School of Medicine

Visual motion processing in the brain is critical for generating movements with appropriate speed and vigor. However, single-cell mechanistic characterizations in vertebrates remain challenging due to the complexity of mammalian brains. The translucent larval zebrafish provides an important model for studying brain-wide visual computations at the cellular level. A key visuomotor transformation in zebrafish is the optomotor response (OMR), in which fish stabilize their body position in response to optic flow. The underlying neural circuits involve the retinorecipient pretectum (Pt) and descending motor command neurons in the midbrain nucleus of the medial longitudinal fasciculus (nMLF). By modeling these circuits in a physics-based neuromechanical simulation, we show that the functional connections between these populations are critical for accurate speed adaptation in the simulation. To causally map how neurons interact and compute visual motion information, we integrated volumetric two-photon microscopy with simultaneous 3D holographic optogenetic photostimulation during visual stimulation and tail tracking. Using these all-optical methods, we uncovered the cellular-level Pt-nMLF functional connectivity, characterizing each neuron’s functional identity, or ‘receptive field’, and its functional role in the circuit’s computation, or ‘projective field’. Our findings reveal that specific visually responsive Pt subtypes differentially modulate specific nMLF neural activity, forming correlation-based functional connectomes that guide motor output. We applied these experimentally derived functional connectivity weights to update our model, improving its behavioral response to variable-speed visual stimuli. These results highlight how all-optical methods can map functional connections to provide new insight into brain-scale sensorimotor transformations in vertebrates.
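A minimal sketch, assuming a simple linear rate readout (not the physics-based neuromechanical simulation itself), of how experimentally derived Pt-to-nMLF functional-connectivity weights could be applied to map pretectal activity onto a motor output; every value below is invented for illustration:

```python
import numpy as np

# Illustrative sketch (not the authors' simulation): a linear rate model in which
# a functional-connectivity weight matrix W maps pretectal (Pt) population rates
# onto nMLF motor-command neurons, whose summed drive sets swim vigor.

rng = np.random.default_rng(1)
n_pt, n_nmlf = 8, 4
W = rng.uniform(0.0, 0.3, size=(n_nmlf, n_pt))  # hypothetical Pt -> nMLF weights

def nmlf_activity(pt_rates, W):
    """Rectified-linear readout of nMLF drive from Pt population rates."""
    return np.maximum(0.0, W @ pt_rates)

def swim_vigor(nmlf_rates, gain=1.0):
    """Scalar motor output (e.g., tail-beat vigor) as the summed nMLF drive."""
    return gain * nmlf_rates.sum()

# Example: faster optic flow scales the Pt activity pattern and, through W,
# the simulated swim vigor.
pt_pattern = np.abs(rng.normal(1.0, 0.2, size=n_pt))  # fixed hypothetical Pt tuning
for flow_speed in (0.5, 1.0, 2.0):
    pt_rates = flow_speed * pt_pattern
    print(flow_speed, round(swim_vigor(nmlf_activity(pt_rates, W)), 3))
```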

Talk 5

How freely moving animals move their eyes during predator/prey interactions and navigate

Jason Kerr1; 1Max Planck Institute for Neurobiology of Behavior

During prey pursuit, how eye rotations such as saccades enable continuous tracking of erratically moving targets while simultaneously enabling an animal to navigate through the environment is unknown. To better understand this, we measured head and eye rotations in freely running ferrets during pursuit behavior. By also tracking the target and all environmental features, we reconstructed the animal’s visual fields and their relationship to retinal structures. In the reconstructed visual fields, the target position clustered on and around the location of the high-acuity retinal area, the area centralis. Surprisingly, this cluster was not significantly shifted by digitally removing either eye saccades, which were elicited exclusively when the ferrets made turns, or head rotations, which were tightly synchronized with the saccades. Here we show that, while the saccades did not fixate the moving target with the area centralis, they instead aligned the area centralis with the intended direction of travel. This also aligned the area centralis with features of the optic flow pattern, such as flow direction and focus of expansion, that many species use for navigation. While saccades initially rotated the eyes in the same direction as the head turn, they were followed by eye rotations countering the ongoing head rotation, which reduced image blur and limited information loss across the visual field during head turns. As we measured the same head and eye rotational relationship in freely moving tree shrews, rats and mice, we suggest that these saccades and counter-rotations are a generalized mechanism enabling mammals to navigate complex environments during pursuit.
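As a small illustration of the kind of alignment measure implied above (hypothetical quantities, not the study's reconstruction pipeline), the offset between the gaze direction of the area centralis and the focus of expansion of the optic flow field can be expressed as the angle between two direction vectors:

```python
import numpy as np

# Illustrative sketch (hypothetical quantities): angular offset between the gaze
# direction of the area centralis and the focus of expansion (FoE) of optic flow,
# both given as 3D direction vectors in a common reference frame.

def angular_offset_deg(gaze_vec, foe_vec):
    """Angle in degrees between two 3D direction vectors."""
    gaze = np.asarray(gaze_vec, float)
    foe = np.asarray(foe_vec, float)
    cosang = np.dot(gaze, foe) / (np.linalg.norm(gaze) * np.linalg.norm(foe))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Example: gaze nearly aligned with the direction of travel (FoE) -> small offset.
print(round(angular_offset_deg([1, 0.05, 0], [1, 0, 0]), 2))  # ~2.86 deg
```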

Talk 6

Neural coding and circuitry of active vision in mice

Philip Parker1; 1Rutgers University

Visual perception is an active process: we constantly move our eyes, head, and body to fully perceive the world around us. Research over the last century has yielded incredible insight into how neurons in the visual system process information, yet our understanding is largely limited to conditions of ‘passive’ rather than ‘active’ vision. This is primarily because experiments are traditionally performed under physically restrictive conditions that prevent the natural sensory consequences of movement (e.g., head-restrained animals presented with isolated stimuli). How does visual processing occur under ethological conditions, where animals freely explore complex visual environments in a goal-dependent manner? We addressed this question by performing visual physiology in freely moving mice, recording the activity of more than 100 V1 neurons while measuring the visual input with a head-mounted camera. In addition to mapping spatiotemporal receptive fields in freely moving animals, we found that V1 neurons jointly code for eye and head position, a coding scheme useful for performing retinocentric-to-egocentric reference frame transformations. We also found that V1 neurons fire in a sequence around saccadic eye movements according to increasing spatial frequency preference, consistent with coarse-to-fine models of visual scene processing. To address how these phenomena relate to goal-directed behavior, ongoing work in the lab is focused on the neural circuits and coding underlying visual distance estimation. Together, we are moving toward a greater understanding of how our visual systems operate under real-world conditions.
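As a toy illustration of the coarse-to-fine signature described above (simulated numbers only, not the study's data or analysis), one can ask whether post-saccadic peak latencies increase with neurons' preferred spatial frequency:

```python
import numpy as np

# Illustrative sketch (simulated data): test a coarse-to-fine sequence by checking
# whether neurons' post-saccadic peak latencies increase with their preferred
# spatial frequency. All numbers below are assumptions, not the study's results.

rng = np.random.default_rng(2)
n_neurons = 60
sf_pref = rng.uniform(0.02, 0.3, n_neurons)        # preferred SF in cycles/deg (hypothetical)
true_latency = 30 + 300 * sf_pref                   # ms: higher SF -> later peak (assumed)
peak_latency = true_latency + rng.normal(0, 15, n_neurons)

def spearman_rho(x, y):
    """Rank correlation between two 1D arrays."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# A positive rank correlation is consistent with coarse-to-fine ordering.
print(round(spearman_rho(sf_pref, peak_latency), 2))
```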