Acoustic facilitation of object movement detection during self-motion
dc.contributor.author | Calabro, Finnegan J. | en_US |
dc.contributor.author | Soto-Faraco, S. | en_US |
dc.contributor.author | Vaina, Lucia M. | en_US |
dc.date.accessioned | 2020-05-19T14:00:40Z | |
dc.date.available | 2020-05-19T14:00:40Z | |
dc.date.issued | 2011-09-22 | |
dc.identifier | http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000293733600019&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=6e74115fe3da270499c3d65c9b17d654 | |
dc.identifier.citation | F.J. Calabro, S. Soto-Faraco, L.M. Vaina. 2011. "Acoustic facilitation of object movement detection during self-motion." Proceedings of the Royal Society B: Biological Sciences, Volume 278, Issue 1719, pp. 2840 - 2847. https://doi.org/10.1098/rspb.2010.2757 | |
dc.identifier.issn | 0962-8452 | |
dc.identifier.issn | 1471-2954 | |
dc.identifier.uri | https://hdl.handle.net/2144/41008 | |
dc.description | To retrieve the figures described in this paper, please refer to the published version: https://doi.org/10.1098/rspb.2010.2757. | en_US |
dc.description.abstract | In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations. | en_US |
dc.description.sponsorship | L.M.V. and F.J.C. were supported by NIH grant RO1NS064100 to L.M.V. S.S.F. was supported by grants from the Ministerio de Ciencia e Innovación (PSI2010-15426 and CSD2007-00012), by the Comissionat per a Universitats i Recerca del DIUE (SRG2009-092) and by the European Research Council (StG-2010 263145). We thank Gerald Kidd and Chris Mason for their helpful suggestions and for generously making available to us the resources of the Sound Field Laboratory at Sargent College, Boston University, supported by grant P30 DC04663. We also thank Franco Rupcich, Benvy Caldwell and Megan Menard for helping with psychophysical data collection and subject recruitment, and Leonardo Sassi for developing and implementing a preliminary version of the object motion task. (RO1NS064100 - NIH; PSI2010-15426 - Ministerio de Ciencia e Innovación; CSD2007-00012 - Ministerio de Ciencia e Innovación; SRG2009-092 - Comissionat per a Universitats i Recerca del DIUE; StG-2010 263145 - European Research Council; P30 DC04663 - Sound Field Laboratory at Sargent College, Boston University; ICREA) | en_US |
dc.format.extent | p. 2840 - 2847 | en_US |
dc.language | English | |
dc.language.iso | en_US | |
dc.publisher | The Royal Society | en_US |
dc.relation.ispartof | Proceedings of the Royal Society B: Biological Sciences | |
dc.subject | Science & technology | en_US |
dc.subject | Life sciences & biomedicine | en_US |
dc.subject | Biology | en_US |
dc.subject | Ecology | en_US |
dc.subject | Evolutionary biology | en_US |
dc.subject | Life sciences & biomedicine - other topics | en_US |
dc.subject | Environmental sciences & ecology | en_US |
dc.subject | Flow parsing | en_US |
dc.subject | Visual search | en_US |
dc.subject | Multisensory perception | en_US |
dc.subject | Visual motion | en_US |
dc.subject | Auditory motion | en_US |
dc.subject | Sensory modalities | en_US |
dc.subject | Retinal motion | en_US |
dc.subject | Audiovisual integration | en_US |
dc.subject | Perception | en_US |
dc.subject | Vision | en_US |
dc.subject | Information | en_US |
dc.subject | Parallel | en_US |
dc.subject | Capture | en_US |
dc.subject | Acoustic stimulation | en_US |
dc.subject | Adult | en_US |
dc.subject | Humans | en_US |
dc.subject | Male | en_US |
dc.subject | Motion perception | en_US |
dc.subject | Movement | en_US |
dc.subject | Photic stimulation | en_US |
dc.subject | Retina | en_US |
dc.subject | Visual perception | en_US |
dc.subject | Young Adult | en_US |
dc.subject | Biological sciences | en_US |
dc.subject | Medical and health sciences | en_US |
dc.subject | Agricultural and veterinary sciences | en_US |
dc.title | Acoustic facilitation of object movement detection during self-motion | en_US |
dc.type | Article | en_US |
dc.description.version | Accepted manuscript | en_US |
dc.identifier.doi | 10.1098/rspb.2010.2757 | |
pubs.elements-source | web-of-science | en_US |
pubs.notes | Embargo: Not known | en_US |
pubs.organisational-group | Boston University | en_US |
pubs.organisational-group | Boston University, College of Engineering | en_US |
pubs.organisational-group | Boston University, College of Engineering, Department of Biomedical Engineering | en_US |
pubs.publication-status | Published | en_US |
dc.identifier.orcid | 0000-0002-5636-8352 (Vaina, LM) | |
dc.identifier.mycv | 68308 |