Projects:


ELECTROPHYSIOLOGY

 

(ii) Visual and Vestibular Interaction

To navigate successfully through our environment we need a reliable representation of our ongoing self motion as well as of the motion of surrounding objects. Relying on a single cue, e.g. vision alone, could be misleading: a large approaching object can produce the same motion pattern on the retina as actual forward self motion. To avoid such perceptual misjudgements it is crucial to combine information from different sensory sources, which can be done most efficiently in brain areas receiving multimodal motion information. Area VIP is known to be involved in the processing of self motion information and receives input from the vestibular, auditory, visual and somatosensory systems. In the present study we therefore compared the responsiveness of area VIP neurons to (A) pure vestibular stimulation (real motion), (B) pure visual self motion simulation, and (C) bimodal stimulation, in which visual as well as vestibular information about the actual self motion was available (see METHODS).
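The retinal ambiguity described above can be made concrete with a short simulation: under a pinhole projection, the image motion produced by forward self motion through a stationary scene is identical to that produced by the same scene approaching a stationary observer. The scene layout, speeds and projection model below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def retinal_flow(points, observer_velocity):
    """Image positions and image velocities of 3D points under a pinhole
    projection (focal length 1) for a purely translating observer.

    points: (N, 3) array of [X, Y, Z] positions in eye coordinates (Z > 0).
    observer_velocity: [Vx, Vy, Vz] translation of the observer.
    """
    X, Y, Z = points.T
    Vx, Vy, Vz = observer_velocity
    # Motion of each (stationary) point relative to the moving eye:
    dX, dY, dZ = -Vx, -Vy, -Vz
    x, y = X / Z, Y / Z                    # image positions
    u = (dX - x * dZ) / Z                  # image velocities
    v = (dY - y * dZ) / Z
    return np.stack([x, y], axis=1), np.stack([u, v], axis=1)

def flow_moving_scene(points, scene_velocity):
    """Image velocities when the observer is stationary and the whole
    scene moves rigidly with the given velocity relative to the eye."""
    X, Y, Z = points.T
    dX, dY, dZ = scene_velocity
    x, y = X / Z, Y / Z
    u = (dX - x * dZ) / Z
    v = (dY - y * dZ) / Z
    return np.stack([u, v], axis=1)

# A cloud of random points in front of the eye (assumed scene)
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -1, 2], [1, 1, 6], size=(200, 3))

# Case 1: forward self motion at 1 m/s through a stationary scene
pos, flow_self = retinal_flow(pts, [0.0, 0.0, 1.0])

# Case 2: stationary observer, scene approaching at 1 m/s.
# The relative motion is the same, so the retinal flow is identical.
flow_object = flow_moving_scene(pts, [0.0, 0.0, -1.0])

print(np.allclose(flow_self, flow_object))
```

Both cases yield a purely radial expansion pattern, which is why vision alone cannot distinguish them and an independent signal such as the vestibular one is needed.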
For linear translational signals we could demonstrate a strong influence of the vestibular system on area VIP: 77% of the neurons were sensitive to the linear vestibular stimulation applied. Interestingly, among the neurons tuned for the direction of the motion stimulus in both conditions (A) and (B), about half showed different direction tuning in the two modalities. The results of experimental condition (C) revealed that both sensory modalities appear to be equally influential during bimodal stimulation, in the sense that either one could determine the cell’s response. Area VIP thus contains information that could be used to disambiguate self motion signals by taking different sensory systems (e.g. the visual and vestibular modality) into account. Future research is needed to determine whether and how the system uses these representations.

(This study was done together with Prof. Frank Bremmer and Prof. Klaus-Peter Hoffmann at the Department of Zoology and Neurobiology, Ruhr-University Bochum, Germany. (pdf-file))

METHODS:


(A) Vestibular stimulation in darkness
The monkey was moved sinusoidally along a linear path in the forward and backward direction. The room was dark to preclude any visual information, and a windshield ensured that no tactile stimulation took place, leaving the vestibular modality as the only possible source for judging the direction of motion.

(B) Pure visual self motion simulation
We simulated the same movement directions as in the vestibular condition (A) with optic flow fields (expansion and contraction stimuli simulating forward and backward motion, respectively). In this condition only visual motion information could be used to determine the actual motion state.

(C) Bimodal stimulation
In this condition the monkey was moved as in (A), the only difference being that the lights were switched on and a stationary random dot pattern was projected onto the screen. Thus, visual as well as vestibular information about the actual self motion was available.
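As an illustrative sketch of the sinusoidal translation used in conditions (A) and (C), the motion profile can be written down in a few lines. The amplitude and frequency below are assumed placeholder values, not the ones used in the experiment.

```python
import numpy as np

# Assumed stimulus parameters (placeholders, not from the study)
amplitude = 0.3   # metres
frequency = 0.5   # Hz

t = np.linspace(0.0, 1.0 / frequency, 1000)   # one full stimulus cycle
position = amplitude * np.sin(2 * np.pi * frequency * t)
velocity = 2 * np.pi * frequency * amplitude * np.cos(2 * np.pi * frequency * t)
acceleration = -(2 * np.pi * frequency) ** 2 * position

# Peak linear acceleration: the quantity sensed by the otolith organs,
# and hence a measure of the strength of the vestibular stimulus.
peak_acc = (2 * np.pi * frequency) ** 2 * amplitude
print(f"peak acceleration: {peak_acc:.2f} m/s^2")
```

Note that for sinusoidal motion the vestibular drive scales with frequency squared, so slow, small-amplitude oscillations deliver only a weak acceleration signal; this is one reason such parameters matter when comparing visual and vestibular responses.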