In reality, speech is just noise. It’s a series of human-produced noises that are loud and soft, high and low, long and short. The thing that distinguishes speech from any other noise is how we perceive it, breaking down each series of consonants and vowels into syllables, words and sentences.
However, every year between one and three of every 1,000 babies are born with hearing loss, a condition that will negatively affect how they perceive this noise for the rest of their lives. Or will it?
Marc Brennan, Ph.D., an Adjunct Professor at UNO and Director of the Amplification and Perception Laboratory at Boys Town National Research Hospital, is conducting research with the goal of eliminating the gap between speech perception of hearing-impaired and normal-hearing individuals.
Born with hearing loss that was diagnosed at age 4, Dr. Brennan knows first-hand the difference that hearing aids can make, but he says he also realizes their limitations. “That is one of the reasons why I decided to help other people out with hearing loss by becoming an Audiologist and pursuing research on methods to improve the ability of children and adults with hearing loss to understand spoken language,” Dr. Brennan said.
His current research project will be conducted over the course of three years, using funds from a recent $11.3 million grant from the National Institutes of Health to create a Center for Perception and Communication in Children, an Institutional Development Award (IDeA) Center for Biomedical Research Excellence. Dr. Brennan’s lab is one of five at Boys Town Hospital that will be devoted to understanding the consequences of childhood hearing loss.
Dr. Brennan and his team will conduct a series of experiments, each lasting an entire semester. Children ages 6-16, both with normal hearing and with hearing loss, as well as adults, will be invited to take part in studies that examine how temporal cues are used to understand speech.
For example, in one experiment, Dr. Brennan and his team will study gap detection. When you hear someone say “bird,” a brief interval of silence precedes the release of the final ‘d’ — the burst that produces the “duh” sound. In technical terms, the ‘d’ is a stop consonant. The lab’s goal is to determine how small that silent interval can be and still be detected, both with and without a hearing aid.
“Hearing loss makes it difficult for children to acquire spoken language, even when fit with appropriate hearing aids,” Dr. Brennan said. “A child’s access to the acoustic cues in speech depends on the ability of a hearing aid to amplify those cues for the child, yet we know little about the effect of hearing aids or hearing loss on a child’s ability to use the acoustic cues in speech. Our series of experiments are designed to optimize a child’s access to these cues and should lead to improved outcomes for children with hearing loss.”
After three years of experiments examining different components of spoken language, the lab hopes to discover a better way to fit and program hearing aids, improving the quality of sound and changing the way that America cares for those with hearing loss.