
Implicit estimation of sound-arrival time

Author

Listed:
  • Yoichi Sugita

    (National Institute of Advanced Industrial Science and Technology, Neuroscience Research Institute)

  • Yôiti Suzuki

    (Research Institute of Electrical Communication and Graduate School of Information Sciences, Tohoku University)

Abstract

In perceiving the sound produced by the movement of a visible object, the brain coordinates the auditory and visual inputs [1,2,3] so that no delay is noticed even though the sound arrives later (for distant source objects, such as aircraft or firework displays, this is less effective). Here we show that coordination occurs because the brain uses information about distance that is supplied by the visual system to calibrate simultaneity. Our findings indicate that auditory and visual inputs are coordinated not because the brain has a wide temporal window for auditory integration, as was previously thought, but because the brain actively changes the temporal location of the window depending on the distance of the visible sound source.
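
The delay referred to in the abstract is the acoustic travel time, which grows linearly with source distance (sound covers roughly 343 m per second in air, i.e. about 3 ms per metre). The following Python sketch is purely illustrative and not from the paper: it computes the expected delay for a given viewing distance and applies a hypothetical simultaneity window centred on that delay. The function names, the 50 ms window half-width, and the speed-of-sound constant are all assumptions chosen here for illustration.

# Illustrative sketch (not from the paper): expected acoustic delay as a
# function of source distance, and a simultaneity judgement whose window is
# centred on that delay, as the abstract's account implies.

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at 20 °C


def acoustic_delay_ms(distance_m: float) -> float:
    """Time in milliseconds for sound to travel distance_m metres."""
    return 1000.0 * distance_m / SPEED_OF_SOUND_M_PER_S


def judged_simultaneous(audio_lag_ms: float, distance_m: float,
                        window_half_width_ms: float = 50.0) -> bool:
    """Hypothetical decision rule: an audio lag is judged simultaneous with the
    visual event if it falls within a fixed-width window centred on the
    expected acoustic delay for the visually estimated distance."""
    expected = acoustic_delay_ms(distance_m)
    return abs(audio_lag_ms - expected) <= window_half_width_ms


if __name__ == "__main__":
    for d in (1, 10, 20, 40):
        print(f"{d:>3} m: expected delay = {acoustic_delay_ms(d):5.1f} ms; "
              f"60 ms audio lag judged simultaneous: {judged_simultaneous(60.0, d)}")

Under this toy rule, a 60 ms audio lag is rejected for a source 1 m away (expected delay about 3 ms) but accepted at 10 or 20 m, where the expected delay is closer to 60 ms; this is the kind of distance-dependent shift of the window that the authors propose.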

Suggested Citation

  • Yoichi Sugita & Yôiti Suzuki, 2003. "Implicit estimation of sound-arrival time," Nature, Nature, vol. 421(6926), pages 911-911, February.
  • Handle: RePEc:nat:nature:v:421:y:2003:i:6926:d:10.1038_421911a
    DOI: 10.1038/421911a

    Download full text from publisher

    File URL: https://www.nature.com/articles/421911a
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/421911a?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Jean-Rémy Martin & Anne Kösem & Virginie van Wassenhove, 2015. "Hysteresis in Audiovisual Synchrony Perception," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-13, March.
    2. Renan Schiavolin Recio & André Mascioli Cravo & Raphael Yokoingawa de Camargo & Virginie van Wassenhove, 2019. "Dissociating the sequential dependency of subjective temporal order from subjective simultaneity," PLOS ONE, Public Library of Science, vol. 14(10), pages 1-10, October.
