Interactive effects of hand-proximity and emotion on vision
26.414, Saturday, May 11, 2:45 - 6:45 pm, Orchid Ballroom
Blaire Weidler1, Richard Abrams1; 1Psychology Department, Washington University in St. Louis
Recent research has revealed that there are remarkable changes in vision for stimuli near the hands. However, the mechanisms underlying these changes remain unclear. We present research suggesting that the mechanism underlying changes in vision for stimuli near the hands may be similar to the mechanism activated following exposure to emotional stimuli. In Experiment 1 participants discriminated the orientation of high spatial frequency (HSF) and low spatial frequency (LSF) Gabor patches with their hands near and far from the stimuli. Just as LSF sensitivity is enhanced following exposure to fearful stimuli, we found LSF sensitivity was enhanced for stimuli near the hands. These data imply that activity in the magnocellular channel, which primarily processes LSF information, is enhanced for stimuli near the hands just as it is for emotional stimuli. In Experiments 2 and 3 we demonstrated that reducing magnocellular processing (by exposing participants to diffuse red light) eliminated effects of hand nearness on vision. Specifically, in Experiment 2, against a red background participants showed no differences in spatial frequency sensitivity across hand positions. Furthermore, in Experiment 3 participants performed a visual search task (for a letter target amongst distractors) on both a green and a red background. Against the green background, we replicated the typical finding of slower rates of search for stimuli near the hands. However, against the red background hand position did not modulate search rates. Finally, in Experiment 4 we directly investigated the interaction between hand position and emotion. When participants searched for a target embedded in a fearful or neutral image, effects of hand position and emotion interacted, indicating that emotion and hand nearness may act through a similar mechanism. Specifically, accuracy was poorer for neutral than for fearful images when the hands were far from the stimuli, but when the hands were near the stimuli accuracy did not differ based on emotion.