In the visual systems of invertebrates and vertebrates there are specialised groups of motion-sensitive neurons with large receptive fields that are optimally tuned to respond to the optic flow produced by the animal's movement through the three-dimensional world. On the basis of their response characteristics, their shared frame of reference with the vestibular (inertial) system, and their anatomical connections, these neurons have been implicated in the stabilisation of retinal images, the control of posture and balance, and the control of the animal's trajectory through the world. Using standard electrophysiological techniques and computer-generated stimuli, we show that some of these flow-field neurons in the pretectal nucleus lentiformis mesencephali of pigeons appear to process motion parallax. Motion parallax was simulated with two large overlapping planes of random dots moving independently: one plane, with larger dots, moved fast, while the other, with smaller dots, moved slowly in the opposite direction. The neurons' responses to these two superimposed planes were facilitated above those produced by a single plane of moving dots and above those produced by two planes moving in the same direction. Furthermore, some of these neurons preferred backward motion in the visual field and others preferred forward motion, suggesting that they may separately code visual objects ‘nearer’ and ‘farther’ than the stabilised (‘on’) plane during forward translational motion. We propose a simple system whereby the relative activity in the ‘near’, ‘far’ and ‘on’ populations could code depth through motion parallax in a metameric manner, similar to that employed to code colour vision and stereopsis.
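The facilitation described above can be caricatured in a few lines: a direction-tuned unit whose response to a fast plane is boosted by a slow plane moving in the opposite direction, and reduced when the two planes share a direction. This is a minimal sketch; the tuning function, the gain values, and the velocity units are illustrative assumptions, not values taken from the recordings.

```python
# Toy sketch of a hypothetical 'near'-preferring flow-field unit.
# Negative velocity = backward motion in the visual field.
# Gains (1.5, 0.7) are illustrative, not measured parameters.

def near_unit_response(v_plane1, v_plane2=None):
    """Response of a unit tuned to backward (negative) motion.

    A second plane moving oppositely (parallax-like shear) facilitates
    the response; a second plane moving in the same direction reduces it,
    mirroring the qualitative pattern reported for these pretectal cells.
    """
    rate = max(0.0, -v_plane1)            # direction-tuned, half-wave rectified
    if v_plane2 is not None:
        if v_plane2 * v_plane1 < 0:       # opposed motion between the planes
            rate *= 1.5                   # facilitation (assumed gain)
        else:
            rate *= 0.7                   # same-direction motion: suppression
    return rate

# Fast backward plane alone, with a slow opposed plane,
# and with a slow same-direction plane:
alone = near_unit_response(-2.0)
parallax = near_unit_response(-2.0, +0.5)
same_dir = near_unit_response(-2.0, -0.5)
```

With these assumed gains, the ordering parallax > alone > same_dir reproduces the qualitative facilitation result, and a unit tuned the other way (preferring positive velocities) would model the ‘far’ population.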
Using electrophysiological techniques, we show that pretectal neurons in pigeons code depth through motion parallax in a manner similar to that employed in stereopsis. During forward translational motion, some neurons preferred fast backward flow-field motion and others preferred fast forward motion, and both were facilitated by slow flow-field motion in the opposite direction, suggesting that they may code visual objects ‘nearer’ and ‘farther’ than the stabilised (‘on’) plane coded by slow-preferring neurons.
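The proposed metameric readout of relative activity in the ‘near’, ‘far’ and ‘on’ populations can be sketched as an opponent signal, analogous to opponent coding in colour vision. The normalisation scheme and the sign convention below are assumptions for illustration, not part of the proposal as stated.

```python
# Toy metameric depth readout from three hypothetical population rates.
# Positive output: object nearer than the stabilised ('on') plane;
# negative: farther. The normalisation is an illustrative choice.

def depth_from_populations(near, far, on):
    """Signed depth relative to the stabilised plane, from relative
    activity of the 'near', 'far' and 'on' populations."""
    total = near + far + on
    if total == 0:
        return 0.0                     # no activity: no depth signal
    return (near - far) / total        # opponent, activity-normalised code
```

Because only the relative activity matters, many different absolute rate combinations map to the same depth signal, which is the sense in which the code is metameric.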