Monday, May 12, 2008, 6:30 – 9:30 pm
BBQ 6:30 – 8:30 pm Vista Ballroom, Vista Terrace and Sunset Deck
Demos 7:30 – 9:30 pm Royal Palm foyer, Acacia Meeting Rooms
Please join us Monday night for the 6th Annual VSS Demo Night, a spectacular night of imaginative demos, social interaction and delectable food. This year’s BBQ will be held on the beautiful Vista Terrace and Sunset Deck overlooking the Naples Grande main pool. Demos will be located upstairs on the ballroom level in the Royal Palm foyer and Acacia Meeting Rooms.
Richard O. Brown, Arthur Shapiro and Shin Shimojo have curated 21 demonstrations of visual phenomena by VSS members, highlighting the important roles demonstrations play in vision research and education.
Demo Night is free for all registered VSS attendees. Meal tickets are not required, but you must wear your VSS badge for entry to the BBQ. Guests and family members of all ages are welcome to attend the demos, but must purchase a ticket for the BBQ. You can register your guests at any time during the meeting at the VSS Registration Desk located in the Royal Palm foyer. A desk will also be set up at the entrance to the BBQ in the Vista Ballroom beginning at 6:00 pm on Monday night.
Guest prices: Adults: $30, Youth (6-12 years old): $15, Children under 6: free
Wide field of view HMD walking experience in Virtual Reality
Bryce Armstrong and Matthias Pusch; WorldViz LLC
New demo worlds by WorldViz will immerse participants more deeply, using a new high-speed wide-area tracking system and a new wide-FOV HMD setup with improved resolution.
LITE Vision Demonstrations
Kenneth Brecher; Boston University
I will present the most recent Project LITE vision demonstrations (including ones not yet posted on the web) – both computer software and new physical objects.
The Blue Arcs – functional imaging of neural activity in your own retina
Richard O. Brown; The Exploratorium
A simple demonstration of the Blue Arcs of the Retina, a beautiful entoptic phenomenon with a long history (Purkinje 1825, Moreland 1968), which deserves to be more widely known.
An opto-mechanical demonstration of differential chromatic and achromatic flicker fusion
Gideon P. Caplovitz and Howard C. Hughes; Dartmouth College
We will present a classic dynamic demonstration of differential flicker fusion rates for achromatic and chromatic flicker, using birefringent materials and polarized light.
Stereo rotation standstill
Max R. Dürsteler; Zurich University Hospital
A rotating spoked wheel defined only by disparity cues appears stationary when the observer fixates the center of rotation. With peripheral fixation, one can infer the wheel’s rotation by tracking single spokes.
Sal, an embodied robotic platform for real-time visual attention, object recognition and manipulation
Lior Elazary, Laurent Itti, Rob Peters and Kai Chang; USC
An integrated robotic head/arm system, controlled by a pair of laptop computers (“dorsal” and “ventral”), will be able to locate, learn, recognize and grasp visual objects in real time.
“The impossible but possible transparency” and other new illusions
Simone Gori and Daniela Bressanelli; University of Trieste and University of Verona
We will demonstrate new motion illusions, including a new effect of transparency that arises in a special condition in which the color combination contradicts the rules of transparency.
A novel method for eye movement detection and fixation training
Parkson Leung, Emmanuel Guzman, Satoru Suzuki, Marcia Grabowecky and Steve Franconeri; Northwestern University
We will demonstrate a rapidly contrast-reversing random-dot display that appears uniform during fixation, but in which the random-dot pattern is perceived during eye movements or blinks.
3D shape recovery from a single 2D image
Yunfeng Li, Tadamasa Sawada, Yll Haxhimusa, Stephen Sebastian and Zygmunt Pizlo; Purdue University
We will demonstrate software that can take a single 2D image of a 3D scene and recover 3D shapes of objects in the scene, based on contours of the objects extracted by hand or automatically.
Rolling perception without rolling motion
Songjoo Oh and Maggie Shiffrar; Rutgers-Newark
We will show that contextual cues systematically trigger the perception of illusory rotation in optically ambiguous, moving homogeneous circles, in which visual cues to rotation are absent.
Pip and pop
Chris Olivers, Erik van der Burg, Jan Theeuwes and Adelbert Bronkhorst; Vrije Universiteit Amsterdam
In dynamic, cluttered displays, a spatially non-specific sound (“pip”) dramatically improves detection and causes “pop out” of a visual stimulus that is otherwise very difficult to spot.
The Phantom Pulse Effect Revisited
David Peterzel; UCSD, SDSU, VA Hospital
The “phantom pulse effect”, in which rapid mirror reversals of one’s body can evoke powerful and unusual visual-tactile sensations, has been optimized and will be demonstrated by two distinct methods.
Mega suppression (aka Granny Smith illusion)
Yury Petrov and Olga Meleshkevich; Northeastern University
When presented in the visual periphery, a brief change in an object’s color is completely masked if an object of matching color is simultaneously flashed nearby.
Strong percepts of motion through depth without strong percepts of position in depth
Bas Rokers and Thad Czuba; The University of Texas at Austin
Binocularly anticorrelated random-dot displays yield poor or nonexistent percepts of depth, yet percepts of motion through depth from the same stimuli are relatively unaffected.
Perpetual collision, long-range argyles, and other illusions
Arthur Shapiro and Emily Knight; Bucknell University
We will show novel interactive visual effects. Perpetual collisions illustrate global motion percepts from local changes at boundaries. Long-range argyles show strong lightness/brightness differences over large distances.
Illusions that illustrate fundamental differences between foveal and peripheral vision
Emily Knight, Arthur Shapiro and Zhong-Lin Lu; Bucknell University and USC
We will present a series of new interactive displays designed to test the hypothesis that peripheral vision contains less precise spatial and temporal phase information than foveal vision.
Smile Maze: Real-time Expression Recognition Game
Jim Tanaka, Jeff Cockburn, Matt Pierce, Javier Movellan and Marni Bartlett; University of Victoria
Smile Maze is an interactive face training exercise, incorporating the Computer Expression Recognition Toolbox developed at UCSD, in which players must produce target facial expressions to advance.
The Rubber Pencil Illusion
Lore Thaler; The Ohio State University
I will demonstrate the Rubber Pencil Illusion. When a pencil is held loosely and wiggled up and down in a combination of translatory and rotational motion, it appears to bend.
Edgeless filling-in and paradoxical edge suppression
Christopher Tyler; Smith-Kettlewell Eye Research Institute
I will demonstrate that ‘edgeless’ afterimages (Gaussian blobs) appear much more readily than sharp-edged ones, which exhibit a prolonged appearance delay. This is the reverse of edge-based filling-in.
Perception of depth determines the illusory motion of subjective surfaces within a wire cube
Albert Yonas; University of Minnesota
When three sides of a concave wire cube are viewed monocularly in front of a surface with minimal texture, the cube most often appears convex. When the viewer moves, both the cube and the surface appear to rotate.