Multi-Sensory Integration

This branch of research focuses on multisensory information processing, specifically the interaction between touch, audition, and vision in action-based tasks. Our approach aims to provide empirical data useful for developing theories of how information from multiple sensory modalities is integrated when perceivers act upon events in the environment. We investigate these processes in both real-life and virtual environments. The recent availability of virtual reality techniques makes it possible to design new kinds of studies, since multimodal stimuli can be conveniently synthesized under computer control. Recent projects on this topic investigate:

  1. the effect of vibrations on auditory judgments, a collaborative project conducted with A. Berry (Université de Sherbrooke), Bombardier, and CAE on soundfield rendering for simulating ambient noise in aircraft cabins and cockpits. We currently investigate the influence of vibrations on auditory judgments of concurrent sounds (e.g., localization tests, auditory comfort evaluations) to determine the extent to which vibrations contribute to the recreated sonic environment.
  2. audio-tactile temporal alignment, conducted in collaboration with V. Hayward (Université Paris 6, France). Here we investigate one of the major questions in the study of multisensory perception, namely how people “decide” whether inputs coming in through their various senses (e.g., hearing and touch) were generated by the same event (Frissen et al., submitted); a minimal analysis sketch follows this list.
  3. active perception of rolling events, in collaboration with V. Hayward (Université Paris 6, France). To evaluate how easy it is to control an interface using the sense of touch, we asked people to estimate various distances a ball had rolled inside a tube, with only haptic feedback at their disposal. Using a virtual ball, we were able to isolate the various physical cues that are available during the course of a ball’s movement (Frissen et al., submitted); a minimal simulation sketch also follows this list.
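
To make the simultaneity question in project 2 concrete, here is a minimal Python sketch of a standard analysis for such a task. All stimulus values, responses, and the Gaussian model are illustrative assumptions on our part, not data or methods from Frissen et al. (submitted): the proportion of "same event" responses is fit as a function of audio-tactile onset asynchrony, and the point of subjective simultaneity (PSS) is read off the fitted curve.

```python
# Illustrative sketch only: hypothetical data and an assumed Gaussian model
# for a simultaneity-judgment task.
import numpy as np
from scipy.optimize import curve_fit

# Stimulus onset asynchronies in ms (negative = touch leads audio); hypothetical.
soa = np.array([-200, -150, -100, -50, 0, 50, 100, 150, 200], dtype=float)
# Hypothetical proportion of trials judged "same event" at each SOA.
p_same = np.array([0.08, 0.20, 0.55, 0.85, 0.95, 0.90, 0.60, 0.25, 0.10])

def gaussian(x, amp, pss, sigma):
    """Bell-shaped simultaneity curve centred on the PSS."""
    return amp * np.exp(-0.5 * ((x - pss) / sigma) ** 2)

params, _ = curve_fit(gaussian, soa, p_same, p0=[1.0, 0.0, 80.0])
amp, pss, sigma = params
print(f"PSS ~ {pss:.1f} ms, binding window sigma ~ {sigma:.1f} ms")
```

A positive PSS here would mean the sound must lag the touch slightly for the two to feel like one event; the width of the fitted curve gives a rough temporal binding window.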
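For project 3, the following sketch shows why a virtual ball is useful: in simulation every physical quantity is known exactly, so individual cues (movement duration, vibration profile, terminal impact) can be manipulated or removed independently. The solid-sphere, rolling-without-slipping dynamics and all parameter values are our assumptions for illustration, not the project's actual model.

```python
# Illustrative sketch only: a virtual ball rolling down a tilted tube under
# assumed dynamics (solid sphere, no slip, acceleration (5/7) g sin(tilt)).
import math

def simulate_roll(tilt_rad: float, tube_length: float, dt: float = 0.001):
    """Integrate the ball's motion until it reaches the end of the tube.

    Returns (elapsed time, distance travelled, terminal speed).
    """
    g = 9.81
    accel = (5.0 / 7.0) * g * math.sin(tilt_rad)
    pos, vel, t = 0.0, 0.0, 0.0
    while pos < tube_length:
        vel += accel * dt          # semi-implicit Euler step
        pos += vel * dt
        t += dt
        # A haptic renderer could emit a vibration whose amplitude tracks
        # `vel` at this point -- one isolable cue among several.
    return t, pos, vel

t, dist, v = simulate_roll(tilt_rad=math.radians(10), tube_length=0.5)
print(f"rolled {dist:.3f} m in {t:.2f} s, terminal speed {v:.2f} m/s")
```

Because the distance travelled is known exactly, a participant's haptic estimate can be compared directly against ground truth in each cue condition.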

Together, the results of this line of research will extend our knowledge of how the different sensory modalities are integrated and will provide important information for the design of multimodal interfaces tuned to human sensory information processing and action.
