dc.contributor.author: Masapollo, Matthew [en_US]
dc.contributor.author: Guenther, Frank H. [en_US]
dc.coverage.spatial: United States [en_US]
dc.date.accessioned: 2020-05-14T18:44:56Z
dc.date.available: 2020-05-14T18:44:56Z
dc.date.issued: 2019-10-25
dc.identifier: https://www.ncbi.nlm.nih.gov/pubmed/31577522
dc.identifier.citation: Matthew Masapollo, Frank H Guenther. 2019. "Engaging the articulators enhances perception of concordant visible speech movements." J Speech Lang Hear Res, Volume 62, Issue 10, pp. 3679-3688. https://doi.org/10.1044/2019_JSLHR-S-19-0167 [en_US]
dc.identifier.issn: 1558-9102
dc.identifier.uri: https://hdl.handle.net/2144/40878
dc.description.abstract: PURPOSE: This study aimed to test whether (and how) somatosensory feedback signals from the vocal tract affect concurrent unimodal visual speech perception. METHOD: Participants discriminated pairs of silent visual utterances of vowels under 3 experimental conditions: (a) normal (baseline) and while holding either (b) a bite block or (c) a lip tube in their mouths. To test the specificity of somatosensory-visual interactions during perception, we assessed discrimination of vowel contrasts optically distinguished based on their mandibular (English /ɛ/-/æ/) or labial (English /u/-French /u/) postures. In addition, we assessed perception of each contrast using dynamically articulating videos and static (single-frame) images of each gesture (at vowel midpoint). RESULTS: Engaging the jaw selectively facilitated perception of the dynamic gestures optically distinct in terms of jaw height, whereas engaging the lips selectively facilitated perception of the dynamic gestures optically distinct in terms of their degree of lip compression and protrusion. Thus, participants perceived visible speech movements in relation to the configuration and shape of their own vocal tract (and possibly their ability to produce covert vowel production-like movements). In contrast, engaging the articulators had no effect when the speaking faces did not move, suggesting that the somatosensory inputs affected perception of time-varying kinematic information rather than changes in target (movement end point) mouth shapes. CONCLUSIONS: These findings suggest that orofacial somatosensory inputs associated with speech production prime premotor and somatosensory brain regions involved in the sensorimotor control of speech, thereby facilitating perception of concordant visible speech movements. SUPPLEMENTAL MATERIAL: https://doi.org/10.23641/asha.9911846 [en_US]
dc.description.sponsorship: R01 DC002852 - NIDCD NIH HHS [en_US]
dc.format.extent: p. 3679-3688 [en_US]
dc.language: eng
dc.language.iso: en_US
dc.publisher: American Speech-Language-Hearing Association [en_US]
dc.relation.ispartof: Journal of Speech, Language, and Hearing Research
dc.subject: Clinical sciences [en_US]
dc.subject: Cognitive sciences [en_US]
dc.subject: Linguistics [en_US]
dc.subject: Speech-language pathology & audiology [en_US]
dc.title: Engaging the articulators enhances perception of concordant visible speech movements [en_US]
dc.type: Article [en_US]
dc.description.version: Accepted manuscript [en_US]
dc.identifier.doi: 10.1044/2019_JSLHR-S-19-0167
pubs.elements-source: pubmed [en_US]
pubs.notes: Embargo: Not known [en_US]
pubs.organisational-group: Boston University [en_US]
pubs.organisational-group: Boston University, College of Health & Rehabilitation Sciences: Sargent College [en_US]
pubs.organisational-group: Boston University, College of Health & Rehabilitation Sciences: Sargent College, Speech, Language & Hearing Sciences [en_US]
pubs.publication-status: Published [en_US]
dc.identifier.mycv: 488600

