Integration efficiency for speech perception within and across sensory modalities by normal-hearing and hearing-impaired individuals

Ken W. Grant*, Jennifer B. Tufts, Steven Greenberg

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In face-to-face speech communication, the listener extracts and integrates information from the acoustic and optic speech signals. Integration occurs within the auditory modality (i.e., across the acoustic frequency spectrum) and across sensory modalities (i.e., across the acoustic and optic signals). The difficulties experienced by some hearing-impaired listeners in understanding speech could be attributed to losses in the extraction of speech information, the integration of speech cues, or both. The present study evaluated the ability of normal-hearing and hearing-impaired listeners to integrate speech information within and across sensory modalities to determine the degree to which integration efficiency may be a factor in the performance of hearing-impaired listeners. Auditory-visual nonsense syllables consisting of eighteen medial consonants surrounded by the vowel [a] were processed into four nonoverlapping acoustic filter bands between 300 and 6000 Hz. A variety of one-, two-, three-, and four-filter-band combinations were presented for identification in auditory-only and auditory-visual conditions; a visual-only condition was also included. Integration efficiency was evaluated using a model of optimal integration. Results showed that normal-hearing and hearing-impaired listeners integrated information across the auditory and visual sensory modalities with a high degree of efficiency, independent of differences in auditory capabilities. However, across-frequency integration for auditory-only input was less efficient for hearing-impaired listeners. These individuals exhibited particular difficulty extracting information from the highest frequency band (4762-6000 Hz) when speech information was presented concurrently in the next lower-frequency band (1890-2381 Hz). Results suggest that integration of speech information within the auditory modality, but not across auditory and visual modalities, affects speech understanding in hearing-impaired listeners.
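To illustrate the general idea of comparing observed performance against a model-based prediction, the sketch below computes an integration-efficiency ratio under a simple independent-channels (probability-summation) baseline. This is an illustrative assumption only; it is not the optimal-integration model used in the study, and the scores in the example are hypothetical, not data from the paper.

```python
# Minimal sketch of an integration-efficiency calculation.
# Assumption: the auditory and visual channels contribute independent
# chances of a correct identification (probability summation). The study
# itself used a model of optimal integration, not this baseline.

def predicted_av_score(p_auditory: float, p_visual: float) -> float:
    """Predicted auditory-visual proportion correct under independence."""
    return p_auditory + p_visual - p_auditory * p_visual

def integration_efficiency(p_observed_av: float,
                           p_auditory: float,
                           p_visual: float) -> float:
    """Ratio of observed to predicted bimodal performance.
    Values near 1.0 indicate near-optimal integration under this baseline;
    values well below 1.0 indicate a loss relative to the prediction."""
    return p_observed_av / predicted_av_score(p_auditory, p_visual)

# Hypothetical example: auditory-only = 0.45 correct, visual-only = 0.30,
# observed auditory-visual = 0.60 correct.
print(integration_efficiency(0.60, 0.45, 0.30))  # ~0.98
```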

Original language: English
Pages (from-to): 1164-1176
Number of pages: 13
Journal: Journal of the Acoustical Society of America
Volume: 121
Issue number: 2
DOIs
State: Published - 2007
Externally published: Yes
