Spatio-chromatic cues in shape and material perception

Poster Presentation: Saturday, May 17, 2025, 2:45 – 6:45 pm, Pavilion
Session: Color, Light and Materials: Surfaces and materials

Zoe R. Goll1, Filipp Schmidt1,2, Emily J. A-Izzeddin1, Celine Aubuchon1, Fatma Kiliç, Roland W. Fleming1,2; 1Justus Liebig University Giessen, Germany, 2Center for Mind, Brain and Behavior, Universities of Marburg, Giessen, and Darmstadt

Every day, we interact with a variety of objects made from different materials and make inferences about their properties based on appearance. These inferences are complex, because the spatial and spectral properties of light reflected from an object depend on illumination, shape, and material. Here, we focused on the role of colour in the perception of shape and material, and especially on the role of spatial gradients of chromatic features, which remains poorly understood. We investigated how gradients of colour saturation and value across an object's surface influence the perception of gloss and three-dimensional (3D) shape. We rendered images of 3D objects with identical shape and illumination but different material appearances. We then removed either the saturation or the value gradients in HSV colour space by replacing all values in that dimension with their mean. Participants viewed four versions of each image simultaneously: the original rendering, the versions with saturation or value gradients removed, and a grayscale version. For a given image set, participants first rated the glossiness of each object version, followed by how 3D it appeared. Compared to the original, perceived gloss was significantly reduced for the versions with either saturation or value gradients removed. In addition, 3D ratings were lower for the version with value gradients removed. These results demonstrate the impact of both saturation and value gradients on gloss perception, and of value gradients on perceived 3D shape, suggesting an important role for such spatio-chromatic cues in shape and material perception.
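The gradient-removal manipulation can be sketched as follows. This is an illustrative reconstruction, not the study's actual pipeline: the helper name and the toy three-pixel "image" are assumptions, and the study operated on full rendered images rather than pixel lists.

```python
import colorsys

def remove_gradient(pixels, channel):
    """Flatten one HSV channel (1 = saturation, 2 = value) by replacing
    every pixel's entry in that channel with the image-wide mean,
    then convert back to RGB. Hypothetical helper for illustration."""
    hsv = [colorsys.rgb_to_hsv(*p) for p in pixels]
    mean = sum(px[channel] for px in hsv) / len(hsv)
    flat = [tuple(mean if i == channel else c for i, c in enumerate(px))
            for px in hsv]
    return [colorsys.hsv_to_rgb(*px) for px in flat]

# Toy "image": three RGB pixels with components in [0, 1].
pixels = [(0.9, 0.2, 0.2), (0.5, 0.5, 0.1), (0.2, 0.3, 0.8)]

# Version with the saturation gradient removed (channel 1);
# channel 2 would remove the value gradient instead.
no_sat_gradient = remove_gradient(pixels, channel=1)
```

After the manipulation, every pixel shares the same saturation (the original mean), while hue and value gradients are untouched, matching the abstract's description of replacing all values in one HSV dimension with the mean.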

Acknowledgements: Supported by DFG, project number 222641018–SFB/TRR 135 TP C1, the ERC Advanced Grant "STUFF" (project number ERC-2022-AdG-101098225); and the Research Cluster "The Adaptive Mind" funded by the Hessian Ministry for Higher Education, Research, Science and the Arts.