Wikiversity:Fellow-Programm Freies Wissen/Einreichungen/An open psychophysics study on audiovisual motion perception


An open psychophysics study on audiovisual motion perception


The research area of multisensory integration studies the combined processing of signals supplied by our different sensory organs, which allows us to form a coherent picture of our environment. The existence of crossmodal biases, where perception in one sense is influenced by information from another sense, suggests that interaction between modalities is an important aspect of the formation of conscious perception. One prominent example of a crossmodal bias is the ventriloquist illusion (Choe, Welch, Gilford, & Juola, 1975), in which the localization of sounds is influenced by visual information. In other experimental paradigms, visual perception can also be influenced by auditory signals (Shams, Kamitani, & Shimojo, 2000). A well-established principle that predicts the direction of crossmodal influence in the localization of static audiovisual stimuli is that the respective unisensory signals are combined according to their reliabilities: when the variance of a signal is high, its weight in the integrated estimate is correspondingly reduced, leading to dominance of the other modality (Alais & Burr, 2004). One aim of this research project is to clarify whether this principle also holds for audiovisual motion perception in humans. Furthermore, I want to identify stimulation parameters for which movement judgments are biased in one or the other direction. These parameters can then be used in a follow-up electroencephalography study to investigate the mechanisms of bidirectional information transfer in multisensory integration.
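The reliability-weighted combination rule referred to above (Alais & Burr, 2004) can be made concrete with a short sketch. The stimulus locations and noise levels below are hypothetical values chosen purely for illustration, assuming the standard maximum-likelihood formulation in which each cue's weight is proportional to its reliability, i.e. the inverse of its variance.

```python
# Reliability-weighted (maximum-likelihood) combination of two
# unisensory location estimates. Each cue is weighted by its
# reliability, the inverse of its variance.

def combine(s_v, sigma_v, s_a, sigma_a):
    """Return the combined location estimate and its standard deviation."""
    r_v = 1.0 / sigma_v**2          # visual reliability
    r_a = 1.0 / sigma_a**2          # auditory reliability
    w_v = r_v / (r_v + r_a)         # visual weight
    w_a = r_a / (r_v + r_a)         # auditory weight
    s_hat = w_v * s_v + w_a * s_a   # combined estimate
    sigma_hat = (1.0 / (r_v + r_a)) ** 0.5  # combined uncertainty
    return s_hat, sigma_hat

# Hypothetical example: a reliable visual cue at 0 deg and a noisy
# auditory cue at 10 deg; the combined estimate is dominated by vision.
s_hat, sigma_hat = combine(s_v=0.0, sigma_v=1.0, s_a=10.0, sigma_a=3.0)
```

Note that the combined variance is always smaller than that of either unisensory estimate, which is the sense in which this combination is near-optimal.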

Twenty participants will be recruited to perform a motion judgment task. Visual stimuli will consist of random dot kinematograms with varying movement coherence; auditory stimuli will consist of white noise convolved with generic head-related transfer functions to change their apparent source location, overlaid with varying levels of static white noise. Participants will be asked to indicate perceived auditory and visual movement with a button press. The main hypothesis is that the amount of crossmodal bias on motion judgments depends on the relative reliability of the bimodal input signals. All aspects of data acquisition and analysis will be performed using only open-source software: the experimental paradigm will be implemented in the Psychophysics Toolbox (Brainard, 1997) running on Octave (Eaton, Bateman, Hauberg, & Wehbring, 2015), and data analysis will be conducted in R (R Development Core Team, 2008). The experimental and analysis source code will be version controlled and published, alongside the raw and analyzed data, on the Open Science Framework. Over the course of the project, documentation and notes on progress and emerging problems will be published, following the principles of open notebook science.
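To illustrate how movement coherence controls the reliability of the visual signal, the sketch below generates one frame of dot displacements for a random dot kinematogram: a coherent fraction of dots moves in the signal direction and the rest move randomly. The dot count, step size, and the restriction to horizontal motion are hypothetical simplifications; the actual paradigm will be implemented in the Psychophysics Toolbox under Octave.

```python
import random

def rdk_displacements(n_dots, coherence, direction=1, step=1.0, rng=random):
    """Return one frame of horizontal dot displacements.

    A `coherence` fraction of dots moves in the signal `direction`
    (+1 = right, -1 = left); the remaining dots move in a random
    direction, providing the noise that lowers visual reliability.
    """
    n_signal = round(coherence * n_dots)
    moves = [direction * step] * n_signal
    moves += [rng.choice((-1, 1)) * step for _ in range(n_dots - n_signal)]
    rng.shuffle(moves)
    return moves

# Hypothetical usage: 100 dots at 50% coherence with rightward signal.
frame = rdk_displacements(n_dots=100, coherence=0.5, direction=1)
```

Lowering `coherence` toward zero makes the net motion signal weaker and the visual estimate noisier, which is the parameter manipulation this study relies on.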


Alais, D., & Burr, D. (2004). The Ventriloquist Effect Results from Near-Optimal Bimodal Integration. Current Biology, 14(3), 257–262.

Brainard, D. H. (1997). The psychophysics toolbox. Spatial Vision, 10(4), 433–436.

Choe, C. S., Welch, R. B., Gilford, R. M., & Juola, J. F. (1975). The “ventriloquist effect”: Visual dominance or response bias? Perception & Psychophysics, 18(1), 55–60.

Eaton, J. W., Bateman, D., Hauberg, S., & Wehbring, R. (2015). GNU Octave version 4.0.0 manual: A high-level interactive language for numerical computations.

R Development Core Team. (2008). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.

Shams, L., Kamitani, Y., & Shimojo, S. (2000). What you see is what you hear. Nature, 408(6814), 788.


  • Name: Mathis Kaiser
  • Institution: Charité - Universitätsmedizin Berlin
  • Contact: