The Object of Time: Temporal Perception of Objects is Improved by Proximity and Singularity
Poster Presentation: Saturday, May 17, 2025, 2:45 – 6:45 pm, Pavilion
Session: Temporal Processing: Duration, timing perception
Alex Ma¹, Martin Wiener¹; ¹George Mason University
The activities of daily life necessitate moving through spaces and visually processing objects at various distances. Research shows that visual responses to single objects differ from responses to scenes containing many objects (Josephs & Konkle, 2020). Separately, the properties of visual scenes are known to dilate and contract subjectively perceived duration (Ma et al., 2024). These experiments explored the possibility that the perceived distance of objects within an image could influence time perception, using a dataset that categorized images into portraits of singular objects, scenes full of objects, and “reachspaces”, which contained objects that appeared at a reachable distance (Josephs et al., 2021). The first experiment (n=20) used 216 images processed to reduce the influence of low-level image properties such as color and brightness, while the second experiment (n=20) additionally used contrast-normalized images. Both experiments employed sub-second temporal bisection tasks and featured an eye-tracking component to control for the effects of saccade density. The results of both experiments showed that perceived distance did not significantly dilate or contract perceived duration, but that the durations of progressively more proximal, singular-object images were perceived more precisely and processed more quickly, following a linear trend. This pattern of results could imply that the processing of singular objects in images involves a mechanism of divisive normalization with correlated noise, in which a greater number of neural processes are required for a greater number of objects. The larger set of object processes engaged by wider scene images would then combine their outputs, producing greater variance than for singular-object images. These findings suggest that the durations of images containing many objects and images containing a single object are processed differently at a perceptual level, with implications for both object processing and models of time perception.
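Although the abstract does not specify a formal model, the correlated-noise intuition can be illustrated with a minimal simulation. The sketch below is an assumption-laden illustration, not the authors' analysis: each object is assumed to engage its own noisy duration-estimating process, the processes share pairwise-correlated noise, and their outputs are summed, so the pooled estimate becomes more variable as the number of objects grows. The parameter values (sigma, rho) and the summation pooling rule are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the authors' model): if each object in an image
# engages its own duration-estimating process, and those processes carry
# pairwise-correlated noise, then combining more of them yields a noisier
# pooled estimate than a single object-level process.
# sigma, rho, and the summation pooling rule below are assumptions.

rng = np.random.default_rng(0)

def pooled_estimate_sd(n_objects, sigma=0.1, rho=0.5, n_trials=100_000):
    """Simulate the SD of a pooled duration estimate formed by summing
    n_objects object-level estimates with pairwise-correlated noise."""
    # Covariance matrix: equal variance sigma^2, pairwise correlation rho
    cov = sigma**2 * (np.full((n_objects, n_objects), rho)
                      + (1 - rho) * np.eye(n_objects))
    noise = rng.multivariate_normal(np.zeros(n_objects), cov, size=n_trials)
    # Assumed pooling rule: simple summation of object-level signals;
    # correlated noise accumulates rather than averaging out.
    pooled = noise.sum(axis=1)
    return pooled.std()

for n in (1, 4, 16):
    print(f"{n:>2} object process(es): pooled SD = {pooled_estimate_sd(n):.3f}")
```

Under these assumptions the simulated SD rises with the number of contributing object-level processes, consistent with the reported loss of temporal precision for multi-object scenes relative to singular-object images.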