Audio-visual sound localization in virtual reality

Authors

  • Thirsa Huisman, Centre for Applied Hearing Research, Technical University of Denmark, DK-2800 Lyngby, Denmark
  • Tobias Piechowiak, GN Hearing, GN ReSound, Region Hovedsteden, Denmark
  • Torsten Dau, Centre for Applied Hearing Research, Technical University of Denmark, DK-2800 Lyngby, Denmark
  • Ewen MacDonald, Centre for Applied Hearing Research, Technical University of Denmark, DK-2800 Lyngby, Denmark

Abstract

Virtual reality (VR) can be a powerful research tool in audio-visual (AV) experiments, as it allows AV integration to be investigated in complex and realistic settings. Here, using a VR setup in combination with a loudspeaker array, 16 normal-hearing participants were tested on their sound localization abilities. The virtual environment was a 1:1 model of the experimental room, except that the loudspeaker array was replaced by a ring. This ring indicated the height, but not the positions, of the loudspeakers. The visual component of the stimuli was a ball that fell, bounced once on the ring, and then disappeared. As the ball collided with the ring, an impact sound was played from a loudspeaker. Participants were asked to indicate the apparent sound origin for both congruent and incongruent visual and auditory spatial positions, ranging from -30 to 30 degrees. The VR visual stimuli, in combination with real auditory stimuli, were capable of inducing AV integration. For several participants, this integration extended over larger ranges of AV disparity than reported in some earlier studies.
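
For context, the degree of AV integration in tasks like this is often compared against a maximum-likelihood (MLE) cue-combination baseline, in which the perceived position is a reliability-weighted average of the auditory and visual cues. The Python sketch below illustrates that standard baseline only; the noise parameters are hypothetical placeholders, and this is not the model fitted in the present study.

def mle_av_estimate(audio_deg, visual_deg, sigma_a=6.0, sigma_v=2.0):
    """Reliability-weighted (MLE) combination of auditory and visual cues.

    audio_deg, visual_deg : cue positions in degrees azimuth
    sigma_a, sigma_v      : assumed cue noise (std. dev.) in degrees
                            (hypothetical values, for illustration only)
    """
    # The more reliable (lower-variance) cue receives the larger weight.
    w_v = sigma_a**2 / (sigma_a**2 + sigma_v**2)  # visual weight
    w_a = 1.0 - w_v                               # auditory weight
    return w_a * audio_deg + w_v * visual_deg

# Example: audio at +10 deg, visual at -10 deg (20 deg AV disparity).
# With the visual cue assumed more reliable, the combined estimate is
# pulled toward the visual position (the ventriloquism effect).
print(mle_av_estimate(10.0, -10.0))  # -> -8.0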

Published

2020-04-23

How to Cite

Huisman, T., Piechowiak, T., Dau, T., & MacDonald, E. (2020). Audio-visual sound localization in virtual reality. Proceedings of the International Symposium on Auditory and Audiological Research, 7, 349–356. Retrieved from https://proceedings.isaar.eu/index.php/isaarproc/article/view/2019-40

Section

2019/5. Other topics in auditory and audiological research