dc.contributor.advisor      Tager-Flusberg, Helen   en_US
dc.contributor.author       Finch, Kayla   en_US
dc.date.accessioned         2019-12-13T16:48:59Z
dc.date.issued              2019
dc.identifier.uri           https://hdl.handle.net/2144/38792
dc.description.abstract     Language is a multimodal process, with visual and auditory cues playing important roles in understanding speech. A well-controlled paradigm with audiovisually matched and mismatched syllables is often used to capture audiovisual (AV) speech processing. The ability to detect and integrate mismatching cues shows large individual variability across development and is linked to later language in typical development (TD) and to social abilities in autism spectrum disorder (ASD). However, no study has used a multimethod approach to better understand AV speech processing in early development. The studies’ aims were to examine behavioral performance, gaze patterns, and neural indices of AV speech in: 1) TD preschoolers (N=60; females=35) and 2) infants at risk for developing ASD (high-risk, HR; N=37; females=10) and TD controls (low-risk, LR; N=42; females=21). In Study 1, I investigated preschoolers’ gaze patterns and behavioral performance when presented with matched and mismatched AV speech and visual-only (lipreading) speech. As hypothesized, lipreading abilities were associated with children’s ability to integrate mismatching AV cues, and children looked towards the mouth when visual cues were helpful, specifically in the lipreading conditions. Unexpectedly, looking time towards the mouth was not associated with children’s ability to integrate mismatching AV cues. Study 2 examined how the visual cues of AV speech modulated auditory event-related potentials (ERPs), and the associations between ERPs and preschoolers’ behavioral performance during an AV speech task. As hypothesized, auditory ERPs were attenuated during AV speech compared to auditory-only speech. Additionally, individual differences in children’s neural processing of auditory and visual cues predicted which cue a child attended to in mismatched AV speech. In Study 3, I investigated ERPs of AV speech in LR and HR 12-month-olds and their association with language abilities at 18 months. Unexpectedly, I found no group differences: all infants were able to detect mismatched AV speech, as measured through a more negative ERP response. As hypothesized, more mature neural processing of AV speech integration, measured as a more positive ERP response to fusible AV cues, predicted later language across all infants. These results highlight the importance of using multimethod approaches to understand variability in AV speech processing at two developmental stages.   en_US
dc.language.iso             en_US
dc.subject                  Developmental psychology   en_US
dc.subject                  Audiovisual speech   en_US
dc.subject                  Autism spectrum disorder   en_US
dc.subject                  ERPs   en_US
dc.subject                  Eye-tracking   en_US
dc.subject                  Language   en_US
dc.subject                  McGurk effect   en_US
dc.title                    Neural indices and looking behaviors of audiovisual speech processing in infancy and early childhood   en_US
dc.type                     Thesis/Dissertation   en_US
dc.date.updated             2019-11-12T20:01:49Z
dc.description.embargo      2021-11-12T00:00:00Z
etd.degree.name             Doctor of Philosophy   en_US
etd.degree.level            doctoral   en_US
etd.degree.discipline       Psychological & Brain Sciences   en_US
etd.degree.grantor          Boston University   en_US
dc.identifier.orcid         0000-0001-6357-8537

