Searching between working-memory versus visual arrays

Poster Presentation: Friday, May 16, 2025, 3:00 – 5:00 pm, Banyan Breezeway
Session: Visual Search: Memory

Dengxinyi Wei1, Daniela Gresch1, Anna C. Nobre1; 1Yale University

Selective attention helps us search for relevant items in the external world and in memory. How external and internal search processes compare, and how they are coordinated to guide behavior, remain open questions. We developed a new experimental design to compare the properties of visual search in perceptual versus working-memory (WM) arrays and to reveal how search across the two domains is prioritized. Forty-nine healthy young adults were recruited online to perform a combined perceptual and working-memory search task, looking for a target item that could be present either in a previously encoded array or in a visible array. At the start of a trial, four quadrant placeholders appeared, and two items briefly occupied two of the locations (500 ms) before being replaced by the placeholders again (800 ms). Two other items then occupied the other two locations, and a central word designated the target for search. Participants indicated whether the target item was present in either their WM or the visual array (yes/no) and received feedback (correct/incorrect). Targets were equally likely to be present or absent and, when present, equally likely to be in the WM or visual array. Accuracy was systematically higher for targets present in the visual display, whereas reaction times were faster for targets present in WM. Ongoing studies are probing the generality of these effects across different types of cues (verbal, object-based, feature-based) and search items (real objects, abstract colored shapes), and are using computational models to test the functional parameters that guide the competition between external and internal search.
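The trial structure and counterbalancing scheme described above can be sketched in code. The following is a minimal illustrative generator, not the authors' implementation: only the timing values (500 ms encoding, 800 ms delay) and the condition proportions (targets equally likely present or absent and, when present, equally likely in the WM or visual array) come from the abstract; all names (`Trial`, `make_trials`, the domain labels) are hypothetical.

```python
import random
from dataclasses import dataclass

# Illustrative sketch of the trial conditions for the combined
# perceptual/working-memory search task. Timing values come from the
# abstract; everything else is an assumption for illustration.

@dataclass
class Trial:
    target_present: bool   # is the target in either array?
    target_domain: str     # "wm", "visual", or "absent"
    encoding_ms: int = 500 # memory items shown for 500 ms
    delay_ms: int = 800    # placeholders shown for 800 ms

def make_trials(n: int, seed: int = 0) -> list:
    """Build n trials: half target-absent, and the present half split
    evenly between WM and visual-array targets, in shuffled order."""
    assert n % 4 == 0, "n must be divisible by 4 for full counterbalancing"
    conditions = (["absent"] * (n // 2)
                  + ["wm"] * (n // 4)
                  + ["visual"] * (n // 4))
    rng = random.Random(seed)
    rng.shuffle(conditions)
    return [Trial(target_present=(c != "absent"), target_domain=c)
            for c in conditions]
```

A sequence built this way preserves the 50/25/25 split of absent, WM-target, and visual-target trials regardless of the shuffle seed.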