Individual differences on an auditory-visual speech perception test for people with hearing loss
Individual differences in auditory-visual speech perception were investigated in people with hearing loss using syllables, words, and sentences. The stimuli were presented in auditory-only, visual-only, and auditory-visual conditions, the latter in both congruent and incongruent forms. In the congruent condition, auditory speech stimuli were presented with their matching visual cues; in the incongruent condition, auditory stimuli were presented with conflicting visual cues. Nine young adults with varying degrees of hearing loss, fitted with hearing aids or cochlear implants, participated in the study. For each condition, the relative gain in speech perception resulting from the addition of visual cues to the auditory signal was calculated. The results showed that the subjects integrated auditory and visual cues more effectively in the congruent auditory-visual condition, whereas the auditory-visual gain was smaller in the incongruent condition. The subjects also showed significant individual differences in the amount of gain across experimental conditions. These results suggest that auditory-visual integration of speech information does occur, but that the degree of integration varies among subjects. The speech stimuli showing the most auditory-visual integration are discussed in the text.