The Lazy Visual Word Form Area: Computational Insights into Location-Sensitivity

Author

Listed:
  • Thomas Hannagan
  • Jonathan Grainger

Abstract

In a recent study, Rauschecker et al. convincingly demonstrate that visual words evoke neural activation signals in the Visual Word Form Area that can be classified according to where the words were presented in the visual field. This result goes against the prevailing consensus and calls for an explanation. We show that one of the simplest possible models of word recognition, a multilayer feedforward network, exhibits precisely the same behavior when trained to recognize words at different locations. The model suggests that the VWFA starts out with location information, and that this information is suppressed during reading acquisition only as much as is needed to achieve location-invariant word recognition. Some new interpretations of Rauschecker et al.'s results are proposed, and three specific predictions are derived to be tested in further studies.

Author Summary: There is a mild form of modern “mind-reading” that involves guessing, from brain signals alone and with heavy fMRI apparatus and software assistance, the locations of words that have been seen by a (consenting) subject. The surprise recently brought to us by Rauschecker et al. is not that we can currently do this, but that we can do it in a brain region that had until now been largely taken to discard information about location: the so-called Visual Word Form Area (VWFA). The contribution of our article is to explain this phenomenon in a principled manner, using computational modeling. The gist of our account is that the VWFA starts out with location information, which is indeed progressively discarded as the region matures, but only insofar as is actually required to recognize words presented at different retinal locations (a necessary feat when one learns to read). This “lazy VWFA” account captures many of the findings reported by Rauschecker et al. in a simple model with very few parameters, and it makes specific predictions that would immediately falsify the model were they found to be incorrect.
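
To make the mechanism concrete, here is a minimal sketch in Python (NumPy only) of the kind of setup the abstract describes: a small one-hidden-layer feedforward network is trained to map words presented at several retinal letter positions onto location-invariant word units, and presentation location is then decoded from the hidden layer with a simple classifier. The toy lexicon, layer sizes, slot-based input code, and nearest-centroid decoder are illustrative assumptions for this sketch; they are not taken from Hannagan and Grainger's model or from Rauschecker et al.

import numpy as np

rng = np.random.default_rng(0)

ALPHABET = "abcdefghijklmnopqrstuvwxyz"
WORDS = ["cat", "dog", "sun", "map", "pen", "jar", "fox", "hat"]   # toy lexicon
WORD_LEN = 3
N_SLOTS = 5                              # retinal letter slots
N_LOCATIONS = N_SLOTS - WORD_LEN + 1     # a 3-letter word fits at 3 locations
INPUT_DIM = N_SLOTS * len(ALPHABET)      # one-hot letter code per slot
HIDDEN_DIM = 20                          # the "VWFA-like" layer
OUTPUT_DIM = len(WORDS)                  # location-invariant word units

def encode(word, loc):
    # Location-specific input: one-hot letters written into slots loc..loc+2.
    x = np.zeros(INPUT_DIM)
    for i, ch in enumerate(word):
        x[(loc + i) * len(ALPHABET) + ALPHABET.index(ch)] = 1.0
    return x

# Training set: every word at every location, with word-identity targets.
X, Y, LOC = [], [], []
for w_idx, w in enumerate(WORDS):
    for loc in range(N_LOCATIONS):
        X.append(encode(w, loc))
        t = np.zeros(OUTPUT_DIM)
        t[w_idx] = 1.0
        Y.append(t)
        LOC.append(loc)
X, Y, LOC = np.array(X), np.array(Y), np.array(LOC)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# One hidden layer, trained by full-batch backpropagation (cross-entropy loss).
W1 = rng.normal(0.0, 0.1, (INPUT_DIM, HIDDEN_DIM))
W2 = rng.normal(0.0, 0.1, (HIDDEN_DIM, OUTPUT_DIM))
lr = 0.5
for epoch in range(5000):
    H = sigmoid(X @ W1)                  # hidden activations
    O = softmax(H @ W2)                  # word-identity output
    dO = (O - Y) / len(X)                # gradient at the output
    dH = (dO @ W2.T) * H * (1.0 - H)     # backpropagated to the hidden layer
    W2 -= lr * H.T @ dO
    W1 -= lr * X.T @ dH

H = sigmoid(X @ W1)
word_acc = np.mean(np.argmax(softmax(H @ W2), axis=1) == np.argmax(Y, axis=1))

# Decode presentation location from the hidden layer with a nearest-centroid
# classifier, leaving one word out at a time so decoding cannot piggyback on
# word identity.
correct = 0
for held_out in range(len(WORDS)):
    test = np.argmax(Y, axis=1) == held_out
    train = ~test
    centroids = np.array([H[train][LOC[train] == l].mean(axis=0)
                          for l in range(N_LOCATIONS)])
    d = ((H[test][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    correct += np.sum(np.argmin(d, axis=1) == LOC[test])
loc_acc = correct / len(X)

print(f"word recognition accuracy: {word_acc:.2f}")
print(f"location decoding accuracy (chance = {1.0 / N_LOCATIONS:.2f}): {loc_acc:.2f}")

Because the word-recognition objective never penalizes residual location information in the hidden layer, location decoding from that layer can remain above chance even after the network achieves location-invariant word recognition, which is the "lazy" behavior the abstract describes.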

Suggested Citation

  • Thomas Hannagan & Jonathan Grainger, 2013. "The Lazy Visual Word Form Area: Computational Insights into Location-Sensitivity," PLOS Computational Biology, Public Library of Science, vol. 9(10), pages 1-12, October.
  • Handle: RePEc:plo:pcbi00:1003250
    DOI: 10.1371/journal.pcbi.1003250

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1003250
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1003250&type=printable
    Download Restriction: no


