
Effects of the Configuration of Hearing Loss on Consonant Perception between Simulated Bimodal and Electric Acoustic Stimulation Hearing
Background Cochlear implant technology allows acoustic and electric stimulation to be combined across ears (bimodal) or within the same ear (electric acoustic stimulation [EAS]). The mechanisms used to integrate speech acoustics may differ between bimodal and EAS hearing, and the configuration of hearing loss may be an important factor in that integration. It is therefore important to differentiate the effects of different configurations of hearing loss on the bimodal or EAS benefit in speech perception (the difference in performance between combined acoustic and electric stimulation and the better stimulation alone).
Purpose Using acoustic simulation, we determined how consonant recognition was affected by different configurations of hearing loss in bimodal and EAS hearing.
Research Design A mixed design was used with one between-subject variable (simulated bimodal group vs. simulated EAS group) and one within-subject variable (acoustic stimulation alone, electric stimulation alone, and combined acoustic and electric stimulations).
Study Sample Twenty adult subjects (10 for each group) with normal hearing were recruited.
Data Collection and Analysis Consonant perception was measured unilaterally or bilaterally in quiet. For the acoustic stimulation, four different simulations of hearing loss were created by band-pass filtering consonants with a fixed lower cutoff frequency of 100 Hz and an upper cutoff frequency of 250, 500, 750, or 1,000 Hz. For the electric stimulation, an eight-channel noise vocoder was used to generate a typical spectral mismatch by using fixed input (200–7,000 Hz) and output (1,000–7,000 Hz) frequency ranges. The effects of simulated hearing loss on consonant recognition were compared between the two groups.
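The two simulations described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the sampling rate, the logarithmic spacing of vocoder band edges, and Hilbert-based envelope extraction are all assumptions made for the sketch, as the abstract specifies only the channel count and the filter frequency ranges.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 16000  # sampling rate in Hz (assumption; not stated in the abstract)

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def simulate_acoustic(x, upper_cutoff):
    """Simulated residual acoustic hearing: fixed 100 Hz lower cutoff,
    upper cutoff of 250, 500, 750, or 1000 Hz."""
    return bandpass(x, 100, upper_cutoff)

def simulate_electric(x, n_channels=8, seed=None):
    """Eight-channel noise vocoder with a spectral mismatch:
    analysis (input) bands span 200-7000 Hz, while the noise carrier
    (output) bands span 1000-7000 Hz."""
    rng = np.random.default_rng(seed)
    # Logarithmically spaced band edges (a common, but assumed, choice).
    ana_edges = np.geomspace(200, 7000, n_channels + 1)
    out_edges = np.geomspace(1000, 7000, n_channels + 1)
    vocoded = np.zeros_like(x, dtype=float)
    for i in range(n_channels):
        band = bandpass(x, ana_edges[i], ana_edges[i + 1])
        envelope = np.abs(hilbert(band))  # temporal envelope of the band
        noise = rng.standard_normal(len(x))
        carrier = bandpass(noise, out_edges[i], out_edges[i + 1])
        vocoded += envelope * carrier     # envelope modulates shifted carrier
    return vocoded
```

In this scheme the envelope of each low analysis band reappears on a higher carrier band, which is one way to model the frequency-to-electrode mismatch that the vocoder in the study was designed to simulate.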
Results Significant bimodal and EAS benefits occurred regardless of the configurations of hearing loss and hearing technology (bimodal vs. EAS). Place information was better transmitted in EAS hearing than in bimodal hearing.
Conclusion These results suggest that configurations of hearing loss are not a significant factor for integrating consonant information between acoustic and electric stimulations. The results also suggest that mechanisms used to integrate consonant information may be similar between bimodal and EAS hearing.
Keywords: acoustic; electric; integration; residual hearing; thresholds
Document Type: Research Article
Affiliations: 1: Department of Communication Sciences and Disorders, Baylor University, Waco, Texas 2: Division of Otolaryngology, Baylor Scott & White Medical Center, Temple, Texas 3: Department of Speech, Language, and Hearing, School of Behavioral and Brain Sciences Callier Center for Communication Disorders, The University of Texas at Dallas, Richardson, Texas
Publication date: September 1, 2021
The Journal of the American Academy of Audiology (JAAA) is a scholarly peer-reviewed publication and the official journal of the American Academy of Audiology. JAAA publishes articles and clinical reports in all areas of audiology, including audiological assessment, amplification, aural habilitation and rehabilitation, auditory electrophysiology, vestibular assessment, hearing and balance public health, and hearing and vestibular science. The journal is an online-only publication with a related continuing-education assessment program available to Academy members. Beginning in January 2025, the Academy resumed its role as the publisher of JAAA.