Material image morph and binocular integration
Poster Presentation: Saturday, May 17, 2025, 2:45 – 6:45 pm, Pavilion
Session: Color, Light and Materials: Surfaces and materials
Hua-Chun Sun1,2, Roland Fleming1,2; 1Justus Liebig University Giessen, Germany, 2Center for Mind, Brain and Behavior (CMBB), University of Marburg, Justus Liebig University Giessen and TU Darmstadt, Germany
Material perception involves both recognising materials and determining their physical properties, even when they are unfamiliar to us. It is widely believed that material recognition involves representing materials (e.g., wood vs metal) in multidimensional feature spaces, which cluster samples of each familiar material within a tight neighbourhood, facilitating recognition and discrimination. Yet a key open question is how we perceive the properties of materials that project to intermediate locations within such spaces (e.g., between wood and metal). Here, we employed deep learning-based image interpolation combined with interocular fusion to probe material space and to measure how the brain integrates material information from the two eyes. We selected 20 images from the STUFF dataset (Schmidt, Hebart, Schmid & Fleming, 2023) to generate 10 cross-category morph pairs (e.g., sand-grass, moss-hair). Observers were presented with two types of stimuli: a reference stimulus, in which each eye viewed a different weighted combination of the morph pair (e.g., 30% sand + 70% grass in the left eye and 70% sand + 30% grass in the right eye), and a match stimulus, presented identically to both eyes. Participants adjusted the match stimulus along 49 morph steps (ranging from 2% to 98%) between the original images to achieve perceptual equality with the reference stimulus. Trials with unfusable or rivalrous percepts were removed from further analysis. Intriguingly, our results show that perceptually distinct materials presented to each eye can be integrated into a coherent novel material percept. The adjusted morph weight for the match stimuli was approximately the midpoint of the two eyes' weights, with greater variance observed in material pairs with larger interocular weight differences. These findings suggest an interocular summation mechanism in the brain for combining material information from the two eyes.
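The matching procedure and the midpoint prediction can be sketched in a few lines of code. This is an illustrative model, not the authors' analysis code: the 49-step morph grid follows the 2%-98% range stated above, and `predicted_match` is a hypothetical interocular-averaging model that predicts the match weight as the midpoint of the two eyes' morph weights, snapped to the nearest available step.

```python
import numpy as np

# 49 morph steps spanning 2% to 98%, as described in the abstract
# (step size works out to 2%)
morph_steps = np.linspace(0.02, 0.98, 49)

def predicted_match(w_left, w_right):
    """Hypothetical interocular-averaging model: the predicted match
    weight is the midpoint of the two eyes' morph weights, snapped to
    the nearest step on the adjustment grid."""
    midpoint = (w_left + w_right) / 2
    return morph_steps[np.argmin(np.abs(morph_steps - midpoint))]

# Example reference stimulus from the abstract: 30% sand + 70% grass
# in one eye, 70% sand + 30% grass in the other -> prediction near 50%
print(predicted_match(0.30, 0.70))
```

Under this model, asymmetric pairings (e.g., 20%/80%) would yield the same midpoint prediction as 40%/60%, so the reported increase in variance for larger interocular weight differences is a property of the data, not of the averaging rule itself.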
Acknowledgements: This research is funded by the DFG (222641018 – SFB/TRR 135 TP C1), the HMWK (“The Adaptive Mind”) and European Research Council Grant ERC-2022-AdG “STUFF” (project number 101098225).