
A jitter after-effect reveals motion-based stabilization of vision

Author

Listed:
  • Ikuya Murakami (Harvard University)
  • Patrick Cavanagh (Harvard University)

Abstract

A shaky hand holding a video camera invariably turns a treasured moment into an annoying, jittery memento. Recent consumer cameras thoughtfully offer stabilization mechanisms to compensate for our unsteady grip. Our eyes face a similar challenge: they are constantly making small movements even when we try to maintain a fixed gaze1. What should be substantial, distracting jitter passes completely unseen. Position changes from large eye movements (saccades) seem to be corrected on the basis of extraretinal signals such as the motor commands sent to the eye muscles2,3,4,5, and the resulting motion responses seem to be simply switched off6,7. But this approach is impracticable for incessant, small displacements, and here we describe a novel visual illusion that reveals a compensation mechanism based on visual motion signals. Observers were adapted to a patch of dynamic random noise and then viewed a larger pattern of static random noise. The static noise in the unadapted regions then appeared to ‘jitter’ coherently in random directions. Several observations indicate that this visual jitter directly reflects fixational eye movements. We propose a model that accounts for this illusion as well as the stability of the visual world during small and/or slow eye movements such as fixational drift, smooth pursuit and low-amplitude mechanical vibrations of the eyes.
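The general intuition behind motion-based stabilization can be sketched numerically: if a motion component shared across the visual field is attributed to the eye and subtracted from local motion signals, the world appears stable; if adaptation weakens the motion signals in one region, the subtraction is miscalibrated and uncancelled jitter remains elsewhere. The toy simulation below is only an illustrative sketch of that intuition, not the authors' actual model; the variable names, the gain value, and the use of the adapted region as the subtraction baseline are assumptions made for the example.

```python
# Illustrative sketch only: NOT the model from the paper, just the generic idea of
# subtracting a common motion estimate (attributed to the eye) from local motion
# signals. All parameter values and the baseline choice are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_frames = 200

# Small fixational eye movements (arbitrary units per frame).
eye_jitter = rng.normal(0.0, 0.3, size=(n_frames, 2))

def sensor_noise():
    return rng.normal(0.0, 0.05, size=(n_frames, 2))

# Both regions see the same retinal image motion (opposite to the eye movement),
# but adaptation to dynamic noise is assumed to reduce motion-signal gain.
adapted_gain = 0.2
unadapted_region = -eye_jitter + sensor_noise()
adapted_region = adapted_gain * (-eye_jitter) + sensor_noise()

# Compensation step: treat one region's motion signal as the eye-movement estimate
# and subtract it everywhere (here, the adapted region serves as the baseline).
baseline = adapted_region
perceived_unadapted = unadapted_region - baseline
perceived_adapted = adapted_region - baseline

print("mean residual motion, unadapted region:", np.abs(perceived_unadapted).mean())
print("mean residual motion, adapted region:  ", np.abs(perceived_adapted).mean())
# With adapted_gain < 1, the unadapted region retains uncancelled motion,
# which is the intuition behind the illusory jitter after-effect.
```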

Suggested Citation

  • Ikuya Murakami & Patrick Cavanagh, 1998. "A jitter after-effect reveals motion-based stabilization of vision," Nature, Nature, vol. 395(6704), pages 798-801, October.
  • Handle: RePEc:nat:nature:v:395:y:1998:i:6704:d:10.1038_27435
    DOI: 10.1038/27435

    Download full text from publisher

    File URL: https://www.nature.com/articles/27435
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/27435?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Zhetuo Zhao & Ehud Ahissar & Jonathan D. Victor & Michele Rucci, 2023. "Inferring visual space from ultra-fine extra-retinal knowledge of gaze position," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    2. Eric G. Wu & Nora Brackbill & Colleen Rhoades & Alexandra Kling & Alex R. Gogliettino & Nishal P. Shah & Alexander Sher & Alan M. Litke & Eero P. Simoncelli & E. J. Chichilnisky, 2024. "Fixational eye movements enhance the precision of visual information transmitted by the primate retina," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    3. Xaq Pitkow & Haim Sompolinsky & Markus Meister, 2007. "A Neural Computation for Visual Acuity in the Presence of Eye Movements," PLOS Biology, Public Library of Science, vol. 5(12), pages 1-14, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:395:y:1998:i:6704:d:10.1038_27435. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. Registering allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.