Cochlear implants provide a degraded subset of auditory cues, which patients must learn to interpret despite the limitations of the device. When speech is degraded, listeners rely on explicit processing to infer the missing information. The goal of this project is to identify the specific cognitive and linguistic skills that listeners who have used cochlear implants from an early age rely on to explicitly process speech cues. Specifically, we will measure working memory, attention, linguistic knowledge, and non-verbal intellectual ability, and quantify the relationship between these skills and speech recognition.
We will develop and refine our set of experiments in adults with normal hearing, then adapt these experiments for use in children with normal hearing, and finally test individuals with cochlear implants. Our experiments combine well-established, clinically relevant measures with innovative approaches that provide more precise estimates of these cognitive and linguistic factors.
These experiments will allow us to compare our results with the previous literature and to expand the set of experimental tools used in this field. Within-group models of cognitive and linguistic predictors of speech recognition will refine existing models of how these factors interact to influence recognition of degraded speech, and across-group comparisons will identify differences in speech recognition mechanisms between groups.
Results of these model comparisons will reveal differences in the development of speech recognition between listeners with normal hearing and listeners with cochlear implants, and will identify specific factors that could be targeted by clinical intervention to improve speech recognition outcomes in listeners with cochlear implants. This project is funded by an NIH Centers of Biomedical Research Excellence (COBRE) grant (NIH-NIGMS / 5P20GM109023-04).
Our lab is located in the Lied Learning and Technology Center at Boys Town National Research Hospital.
The lab contains a sound booth where we test listeners with their everyday hearing, research interfaces that allow us to directly control the stimulation delivered by cochlear implants, and a comfortable space for visitors who accompany participants to the lab. Participants complete our experiments at their own pace using custom software on lab computers.