Summary: Recent research has revealed that hearing plays an important role in regulating speech movements. Researchers found that when individuals are temporarily unable to hear their own speech, their ability to control jaw and tongue movements declines.
This finding is particularly important for understanding speech production in people with hearing loss, including those with cochlear implants, and may lead to new treatment strategies focused on oral motor training.
Key facts:
Hearing loss impairs the real-time coordination of speech movements.
When auditory feedback is reduced, people may become more dependent on oromotor feedback.
New treatments based on oral motor training may improve speech for people with hearing loss or cochlear implants.
Source: McGill University
A McGill University study showed that hearing plays a key role in how people coordinate and control the movement of speech in real time.
The study, published in the Journal of the Acoustical Society of America, shows that when people are unable to hear their own speech, even for short periods of time, their ability to move the jaw and tongue in a coordinated manner is impaired.
Lead author Matthew Masapollo, who conducted the study while working as a research associate at McGill’s Motor Neuroscience Institute, said: “The way people coordinate and control the movements of their vocal tract for speech production relies, in part, on immediate auditory feedback.”
The researchers used electromagnetic articulography (EMA) to track jaw and tongue-tip movements during speech in people with normal hearing under two conditions: one in which participants could hear their own speech, and one in which it was masked by multi-speaker babble noise.
In the masked condition, participants were temporarily unable to hear themselves, and their speech motor control was impaired.
This finding has important implications for understanding speech production in people with hearing loss, especially those with cochlear implants.
“Even years after implantation, some aspects of speech production remain impaired, no doubt because the auditory signals received through the CI are degraded,” Masapollo said.
Understanding how degraded auditory input affects speech production could help in assessing the effectiveness of cochlear implants and in guiding efforts to help children with profound hearing loss learn to speak, the researchers noted.
Masapollo is currently collaborating with Professor Susan Nittrouer, McGill Professor David J. Ostry, and Professor Lucie Ménard to investigate how the reduced access to sound associated with cochlear implants affects the speech produced by implant recipients.
Preliminary findings suggest that people with hearing loss may rely more on mouth and tongue sensations than on auditory feedback to control speech movements.
If this is confirmed, clinical researchers could build on it by developing new therapeutic interventions focused on oral motor training to assist children and adults with hearing loss.
About this auditory neuroscience research news
Author: Claire Loewen
Source: McGill University
Contact: Claire Loewen – McGill University
Image: Image credited to Neuroscience News
Original research: Closed access.
“Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure,” by Matthew Masapollo et al., Journal of the Acoustical Society of America
Abstract
Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure
Research has shown that speakers reliably adjust the timing of articulator movements in response to variation in production rate and syllable stress, and that the precision of this inter-articulator timing shapes the phonetic structure of the resulting acoustic signal.
Here, we tested the hypothesis that immediate auditory feedback helps regulate this consistent inter-articulator timing control.
Speakers with normal hearing were recorded with electromagnetic articulography while producing 480 /tV#Cat/ utterances with alternating V (/ɑ/-/ɛ/) and C (/t/-/d/), across variation in production rate (fast-normal) and stress (first syllable stressed-unstressed). Utterances were produced under two listening conditions: unmasked and masked.
To quantify the effect of immediate auditory feedback on coordination between the jaw and tongue tip, the timing of tongue-tip raising onset for C, relative to the jaw opening-closing cycle for V, was measured in each listening condition.
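The latency measure described here can be sketched in a few lines of Python. This is an illustrative reconstruction under stated assumptions, not the study's analysis code: the function name and the event-time inputs are hypothetical, and it simply assumes that onset/offset times have already been extracted from the EMA trajectories.

```python
def tongue_tip_latency(jaw_open_onset, jaw_close_offset, tt_raise_onset):
    """Latency of tongue-tip raising relative to the jaw cycle.

    All arguments are event times (in seconds) for one utterance,
    as would be extracted from EMA movement trajectories.
    Returns (absolute latency, latency as a fraction of the jaw cycle).
    """
    cycle_duration = jaw_close_offset - jaw_open_onset
    if cycle_duration <= 0:
        raise ValueError("jaw cycle must have positive duration")
    latency = tt_raise_onset - jaw_open_onset
    return latency, latency / cycle_duration

# Hypothetical example: a 200 ms jaw cycle in which tongue-tip raising
# begins 150 ms after jaw opening onset.
lat, rel = tongue_tip_latency(0.00, 0.20, 0.15)
```

Expressing latency both absolutely and as a proportion of the jaw cycle makes it comparable across the fast and normal production rates, since faster speech compresses the cycle itself.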
Across both listening conditions, manipulations that shortened the jaw opening-closing cycle also shortened the latency of tongue-tip raising onset relative to the onset of jaw opening. Moreover, tongue-tip latencies were strongly related to phonetic structure.
Under auditory masking, however, tongue-tip latencies were less strongly related to phonetic structure, indicating that speakers use auditory afferent signals in real time to tune the precision of inter-articulator timing in service to phonetic structure.
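The contrast between listening conditions can be illustrated with a small sketch. The data below are invented and the effect-size helper is a crude stand-in (a Cohen's-d-style measure), not the study's actual statistics; it only shows the *shape* of the reported pattern, in which latencies separate by phonetic category more cleanly when speakers can hear themselves.

```python
from math import sqrt
from statistics import mean, stdev

def category_separation(lat_a, lat_b):
    """Crude effect size between two lists of tongue-tip latencies,
    one per phonetic category (e.g. /t/ vs /d/): difference of means
    in pooled-standard-deviation units."""
    pooled = sqrt((stdev(lat_a) ** 2 + stdev(lat_b) ** 2) / 2)
    return abs(mean(lat_a) - mean(lat_b)) / pooled

# Hypothetical latencies (seconds): well-separated categories when
# auditory feedback is available, more overlap under masking.
unmasked = category_separation([0.10, 0.11, 0.12], [0.16, 0.17, 0.18])
masked = category_separation([0.12, 0.14, 0.16], [0.13, 0.15, 0.17])
# A larger separation in the unmasked condition mirrors the reported
# pattern of latencies tracking phonetic structure more tightly.
```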