
The detection of a stimulus can be considerably facilitated if the stimulus engages two or more sensory modalities simultaneously. This phenomenon, commonly referred to as multisensory (or cross-modal) facilitation, has been demonstrated behaviorally in cats and humans. A number of rules are thought to govern this phenomenon. These rules state that strong facilitation is to be expected only if the two sensory modalities are stimulated simultaneously and at the same place, and if the stimuli themselves are weak. However, these rules are not sufficient to allow accurate predictions of multimodal stimulus detection probabilities directly from physical stimulus parameters. Here we show that such predictions are possible on the basis of a simple and biologically plausible psychophysical model, which relates the detection of audio-visual, audio-tactile or visual-tactile stimuli to the Euclidean distance that these stimuli span in an orthogonal sensory space.
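The core idea of the model can be sketched in a few lines: each modality's stimulus strength is treated as one coordinate of a vector in an orthogonal sensory space, and detection depends on that vector's Euclidean length. The sketch below is illustrative only; the choice of a Weibull psychometric function, the `slope` parameter, and the assumption that intensities are expressed in threshold units are not specified by the abstract.

```python
import math

def effective_intensity(intensities):
    """Combine per-modality stimulus intensities as the Euclidean norm
    of the vector they span in an orthogonal sensory space."""
    return math.sqrt(sum(s * s for s in intensities))

def detection_probability(intensities, slope=2.0):
    """Map the combined intensity to a detection probability.

    A Weibull psychometric function is used here as an illustrative
    choice (hypothetical; the abstract does not state the function's
    form). Intensities are assumed to be in units of each modality's
    unimodal detection threshold.
    """
    d = effective_intensity(intensities)
    return 1.0 - math.exp(-(d ** slope))

# Two weak stimuli presented together yield a longer vector, and hence
# a higher predicted detection probability, than either one alone.
unimodal = detection_probability([0.7])
bimodal = detection_probability([0.7, 0.7])
```

Under this sketch, bimodal facilitation falls out of the geometry: combining two weak stimuli of strength 0.7 gives a vector of length sqrt(0.7² + 0.7²) ≈ 0.99, which sits higher on the psychometric function than 0.7 alone.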

Original publication

Journal article
Exp Brain Res
Publication Date
Pages: 181 - 190

Keywords: Acoustic Stimulation, Auditory Perception, Female, Humans, Male, Models, Biological, Space Perception, Touch, Vibration