dc.contributor.author	Calabro, Finnegan J.	en_US
dc.contributor.author	Soto-Faraco, S.	en_US
dc.contributor.author	Vaina, Lucia M.	en_US
dc.date.accessioned	2020-05-19T14:00:40Z
dc.date.available	2020-05-19T14:00:40Z
dc.date.issued	2011-09-22
dc.identifier	http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000293733600019&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=6e74115fe3da270499c3d65c9b17d654
dc.identifier.citation	F.J. Calabro, S. Soto-Faraco, L.M. Vaina. 2011. "Acoustic facilitation of object movement detection during self-motion." Proceedings of the Royal Society B: Biological Sciences, Volume 278, Issue 1719, pp. 2840 - 2847. https://doi.org/10.1098/rspb.2010.2757	en_US
dc.identifier.issn	0962-8452
dc.identifier.issn	1471-2954
dc.identifier.uri	https://hdl.handle.net/2144/41008
dc.description	To retrieve the figures described in this paper, please refer to the published version: https://doi.org/10.1098/rspb.2010.2757.	en_US
dc.description.abstract	In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations.	en_US
dc.description.sponsorship	L.M.V. and F.J.C. were supported by NIH grant RO1NS064100 to L.M.V. S.S.F. was supported by grants from the Ministerio de Ciencia e Innovación (PSI2010-15426 and CSD2007-00012), by the Comissionat per a Universitats i Recerca del DIUE (SRG2009-092) and by the European Research Council (StG-2010 263145). We thank Gerald Kidd and Chris Mason for their helpful suggestions and for generously making available to us the resources of the Sound Field Laboratory at Sargent College, Boston University, supported by grant P30 DC04663. We also thank Franco Rupcich, Benvy Caldwell and Megan Menard for helping with psychophysical data collection and subject recruitment, and Leonardo Sassi for developing and implementing a preliminary version of the object motion task. (RO1NS064100 - NIH; PSI2010-15426 - Ministerio de Ciencia e Innovación; CSD2007-00012 - Ministerio de Ciencia e Innovación; SRG2009-092 - Comissionat per a Universitats i Recerca del DIUE; StG-2010 263145 - European Research Council; P30 DC04663 - Sound Field Laboratory at Sargent College, Boston University; ICREA)	en_US
dc.format.extent	p. 2840 - 2847	en_US
dc.language	English
dc.language.iso	en_US
dc.publisher	The Royal Society	en_US
dc.relation.ispartof	Proceedings of the Royal Society B: Biological Sciences
dc.subject	Science & technology	en_US
dc.subject	Life sciences & biomedicine	en_US
dc.subject	Biology	en_US
dc.subject	Ecology	en_US
dc.subject	Evolutionary biology	en_US
dc.subject	Life sciences & biomedicine - other topics	en_US
dc.subject	Environmental sciences & ecology	en_US
dc.subject	Flow parsing	en_US
dc.subject	Visual search	en_US
dc.subject	Multisensory perception	en_US
dc.subject	Visual motion	en_US
dc.subject	Auditory motion	en_US
dc.subject	Sensory modalities	en_US
dc.subject	Retinal motion	en_US
dc.subject	Audiovisual integration	en_US
dc.subject	Perception	en_US
dc.subject	Vision	en_US
dc.subject	Information	en_US
dc.subject	Parallel	en_US
dc.subject	Capture	en_US
dc.subject	Acoustic stimulation	en_US
dc.subject	Adult	en_US
dc.subject	Humans	en_US
dc.subject	Male	en_US
dc.subject	Motion perception	en_US
dc.subject	Movement	en_US
dc.subject	Photic stimulation	en_US
dc.subject	Retina	en_US
dc.subject	Visual perception	en_US
dc.subject	Young Adult	en_US
dc.subject	Biological sciences	en_US
dc.subject	Medical and health sciences	en_US
dc.subject	Agricultural and veterinary sciences	en_US
dc.title	Acoustic facilitation of object movement detection during self-motion	en_US
dc.type	Article	en_US
dc.description.version	Accepted manuscript	en_US
dc.identifier.doi	10.1098/rspb.2010.2757
pubs.elements-source	web-of-science	en_US
pubs.notes	Embargo: Not known	en_US
pubs.organisational-group	Boston University	en_US
pubs.organisational-group	Boston University, College of Engineering	en_US
pubs.organisational-group	Boston University, College of Engineering, Department of Biomedical Engineering	en_US
pubs.publication-status	Published	en_US
dc.identifier.orcid	0000-0002-5636-8352 (Vaina, LM)
dc.identifier.mycv	68308