Audio-visual sound localization in virtual reality
Abstract
Virtual reality (VR) can be a powerful research tool in audio-visual (AV) experiments, as it allows AV integration to be investigated in complex and realistic settings. Here, using a VR setup in combination with a loudspeaker array, 16 normal-hearing participants were tested on their sound-localization abilities. The virtual environment was a 1:1 model of the experimental room, except that the loudspeaker array was replaced by a ring. This ring indicated the height, but not the positions, of the loudspeakers. The visual component of the stimuli consisted of a ball falling onto the ring, bouncing once, and then disappearing. At the moment the ball collided with the ring, an impact sound was played from one of the loudspeakers. Participants were asked to indicate the apparent sound origin, for both congruent and incongruent visual and auditory spatial positions ranging from -30 to 30 degrees. The VR visual stimuli, in combination with real auditory stimuli, were capable of inducing AV integration. For several participants, this integration extended over larger ranges of AV disparity than reported in some earlier studies.
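AV integration of the kind measured here is often modeled in the literature as maximum-likelihood cue combination, in which the auditory and visual position estimates are averaged with weights inversely proportional to their variances. The sketch below is illustrative only and is not this study's analysis: the function name and all noise parameters are assumed for the example.

```python
import math

def mle_fusion(s_a, s_v, sigma_a, sigma_v):
    """Fuse auditory and visual position estimates (degrees).

    Hypothetical helper illustrating standard maximum-likelihood
    cue combination; parameter values are assumptions, not data
    from this study.
    """
    # Weights are inversely proportional to each cue's variance.
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)  # auditory weight
    w_v = 1.0 - w_a                               # visual weight
    s_av = w_a * s_a + w_v * s_v                  # fused position estimate
    # The fused estimate has lower variance than either cue alone.
    sigma_av = math.sqrt((sigma_a**2 * sigma_v**2) /
                         (sigma_a**2 + sigma_v**2))
    return s_av, sigma_av

# Example with assumed values: sound at +10 deg, ball at -10 deg,
# vision more precise than audition -> percept pulled toward vision.
s_av, sigma_av = mle_fusion(s_a=10.0, s_v=-10.0, sigma_a=6.0, sigma_v=2.0)
print(f"fused estimate: {s_av:.1f} deg, fused sd: {sigma_av:.2f} deg")
```

Under these assumed noise levels the fused percept lands at -8 degrees, i.e., strongly captured by the visual cue; larger assumed auditory noise would predict visual capture over wider AV disparities.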