Engaging the articulators enhances perception of concordant visible speech movements
Date Issued
2019-10-25
Publisher Version
10.1044/2019_JSLHR-S-19-0167
Author(s)
Masapollo, Matthew
Guenther, Frank H.
Permanent Link
https://hdl.handle.net/2144/40878
Version
Accepted manuscript
Citation (published version)
Matthew Masapollo, Frank H. Guenther. 2019. "Engaging the articulators enhances perception of concordant visible speech movements." J Speech Lang Hear Res, Volume 62, Issue 10, pp. 3679-3688. https://doi.org/10.1044/2019_JSLHR-S-19-0167
Abstract
PURPOSE
This study aimed to test whether (and how) somatosensory feedback signals from the vocal tract affect concurrent unimodal visual speech perception.
METHOD
Participants discriminated pairs of silent visual utterances of vowels under 3 experimental conditions: (a) normal (baseline) and while holding either (b) a bite block or (c) a lip tube in their mouths. To test the specificity of somatosensory-visual interactions during perception, we assessed discrimination of vowel contrasts optically distinguished based on their mandibular (English /ɛ/-/æ/) or labial (English /u/-French /u/) postures. In addition, we assessed perception of each contrast using dynamically articulating videos and static (single-frame) images of each gesture (at vowel midpoint).
RESULTS
Engaging the jaw selectively facilitated perception of the dynamic gestures optically distinct in terms of jaw height, whereas engaging the lips selectively facilitated perception of the dynamic gestures optically distinct in terms of their degree of lip compression and protrusion. Thus, participants perceived visible speech movements in relation to the configuration and shape of their own vocal tract (and possibly their ability to produce covert vowel production-like movements). In contrast, engaging the articulators had no effect when the speaking faces did not move, suggesting that the somatosensory inputs affected perception of time-varying kinematic information rather than changes in target (movement end point) mouth shapes.
CONCLUSIONS
These findings suggest that orofacial somatosensory inputs associated with speech production prime premotor and somatosensory brain regions involved in the sensorimotor control of speech, thereby facilitating perception of concordant visible speech movements.
SUPPLEMENTAL MATERIAL
https://doi.org/10.23641/asha.9911846