
Independent working memory resources for egocentric and allocentric spatial information

Author

Listed:
  • David Aagten-Murphy
  • Paul M Bays

Abstract

Visuospatial working memory enables us to maintain access to visual information for processing even when a stimulus is no longer present, due to occlusion, our own movements, or transience of the stimulus. Here we show that, when localizing remembered stimuli, the precision of spatial recall does not rely solely on memory for individual stimuli, but additionally depends on the relative distances between stimuli and visual landmarks in the surroundings. Across three separate experiments, we consistently observed a spatially selective improvement in the precision of recall for items located near a persistent landmark. While the results did not require that the landmark be visible throughout the memory delay period, it was essential that it was visible both during encoding and response. We present a simple model that can accurately capture human performance by considering relative (allocentric) spatial information as an independent localization estimate which degrades with distance and is optimally integrated with egocentric spatial information. Critically, allocentric information was encoded without cost to egocentric estimation, demonstrating independent storage of the two sources of information. Finally, when egocentric and allocentric estimates were put in conflict, the model successfully predicted the resulting localization errors. We suggest that the relative distance between stimuli represents an additional, independent spatial cue for memory recall. This cue information is likely to be critical for spatial localization in natural settings, which contain an abundance of visual landmarks.

Author summary

Human capacity to maintain spatial information over brief interruptions is strongly limited. However, while studies of visual working memory typically examine recall in sparse displays, consisting only of the stimuli to remember, natural scenes are commonly filled with other objects that, although not required to be remembered, may nevertheless influence subsequent localization. We demonstrate that memory for spatial location depends on independent stores for egocentric (relative to the observer) and allocentric (relative to other stimuli) information about object position. Both types of spatial representation become increasingly imprecise as the number of objects in memory increases. However, even when visual landmarks are present, and allocentric information is encoded, there is no change in egocentric precision. This suggests that the encoding of additional allocentric spatial information does not compete for working memory resources with egocentric spatial information. Additionally, the fidelity of allocentric position information diminished rapidly with distance, resulting in a spatially specific advantage for recall of objects in the vicinity of stable landmarks. The effect of a landmark on recall matches that of an ideal observer who optimally combines egocentric and allocentric cues. This work provides a new experimental and theoretical framework for the investigation of spatial memory mechanisms.
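
The optimal integration described in the abstract can be illustrated with a minimal Python sketch, assuming independent Gaussian noise on the egocentric and allocentric estimates and a linear growth of allocentric noise with target-landmark distance. The function names (allocentric_variance, combine) and all parameter values below are hypothetical illustrations, not taken from the paper.

    def allocentric_variance(dist, base_var=0.5, slope=0.3):
        # Hypothetical noise model: the landmark-relative (allocentric) estimate
        # becomes less reliable the farther the target is from the landmark.
        # The linear form and parameter values are illustrative only.
        return base_var + slope * dist

    def combine(ego_est, ego_var, allo_est, allo_var):
        # Inverse-variance (reliability-weighted) combination of two independent
        # location estimates, i.e. standard optimal cue integration.
        w_ego = (1.0 / ego_var) / (1.0 / ego_var + 1.0 / allo_var)
        est = w_ego * ego_est + (1.0 - w_ego) * allo_est
        var = 1.0 / (1.0 / ego_var + 1.0 / allo_var)
        return est, var

    # Same egocentric noise, target near vs. far from the landmark.
    ego_est, ego_var = 10.0, 1.0   # remembered position and its variance (arbitrary units)
    allo_est = 10.4                # landmark-relative estimate of the same position
    for dist in (1.0, 8.0):
        est, var = combine(ego_est, ego_var, allo_est, allocentric_variance(dist))
        print(f"distance {dist}: combined estimate {est:.2f}, variance {var:.2f}")

Because the cues are weighted by their reliabilities, the combined variance is never larger than the egocentric variance alone, the benefit shrinks with distance from the landmark, and a conflicting allocentric estimate pulls the response toward the more reliable cue, which is the qualitative pattern the abstract reports.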

Suggested Citation

  • David Aagten-Murphy & Paul M Bays, 2019. "Independent working memory resources for egocentric and allocentric spatial information," PLOS Computational Biology, Public Library of Science, vol. 15(2), pages 1-20, February.
  • Handle: RePEc:plo:pcbi00:1006563
    DOI: 10.1371/journal.pcbi.1006563

    Download full text from publisher

    File URL: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1006563
    Download Restriction: no

    File URL: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1006563&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pcbi.1006563?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    References listed on IDEAS

    1. Steven J. Luck & Edward K. Vogel, 1997. "The capacity of visual working memory for features and conjunctions," Nature, Nature, vol. 390(6657), pages 279-281, November.
    2. Weiwei Zhang & Steven J. Luck, 2008. "Discrete fixed-resolution representations in visual working memory," Nature, Nature, vol. 453(7192), pages 233-235, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yuri A. Markov & Natalia A. Tiurina & Igor S. Utochkin, 2018. "Different features are stored independently in visual working memory but mediated by object-based representations," HSE Working papers WP BRP 101/PSY/2018, National Research University Higher School of Economics.
    2. Yuri A. Markov & Igor S. Utochkin, 2017. "The Effect of Object Distinctiveness on Object-Location Binding in Visual Working Memory," HSE Working papers WP BRP 79/PSY/2017, National Research University Higher School of Economics.
    3. J David Timm & Frank Papenmeier, 2019. "Reorganization of spatial configurations in visual working memory: A matter of set size?," PLOS ONE, Public Library of Science, vol. 14(11), pages 1-16, November.
    4. Shaiyan Keshvari & Ronald van den Berg & Wei Ji Ma, 2013. "No Evidence for an Item Limit in Change Detection," PLOS Computational Biology, Public Library of Science, vol. 9(2), pages 1-9, February.
    5. Ken McAnally & Russell Martin, 2016. "Modelling Visual Change Detection and Identification under Free Viewing Conditions," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-16, February.
    6. Jack Phu & Michael Kalloniatis & Sieu K Khuu, 2016. "The Effect of Attentional Cueing and Spatial Uncertainty in Visual Field Testing," PLOS ONE, Public Library of Science, vol. 11(3), pages 1-18, March.
    7. Haggar Cohen-Dallal & Isaac Fradkin & Yoni Pertzov, 2018. "Are stronger memories forgotten more slowly? No evidence that memory strength influences the rate of forgetting," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-18, July.
    8. Loic Matthey & Paul M Bays & Peter Dayan, 2015. "A Probabilistic Palimpsest Model of Visual Short-term Memory," PLOS Computational Biology, Public Library of Science, vol. 11(1), pages 1-34, January.
    9. David W Sutterer & Joshua J Foster & Kirsten C S Adam & Edward K Vogel & Edward Awh, 2019. "Item-specific delay activity demonstrates concurrent storage of multiple active neural representations in working memory," PLOS Biology, Public Library of Science, vol. 17(4), pages 1-25, April.
    10. Mohammad Zia Ul Haq Katshu & Giovanni d'Avossa, 2014. "Fine-Grained, Local Maps and Coarse, Global Representations Support Human Spatial Working Memory," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-13, September.
    11. Igor S. Utochkin & Vladislav A. Khvostov & Yulia M. Stakina, 2017. "Ensemble-Based Segmentation in the Perception of Multiple Feature Conjunctions," HSE Working papers WP BRP 78/PSY/2017, National Research University Higher School of Economics.
    12. Jastrzębski, Jan & Ciechanowska, Iwona & Chuderski, Adam, 2018. "The strong link between fluid intelligence and working memory cannot be explained away by strategy use," Intelligence, Elsevier, vol. 66(C), pages 44-53.
    13. Aki Kondo & Jun Saiki, 2012. "Feature-Specific Encoding Flexibility in Visual Working Memory," PLOS ONE, Public Library of Science, vol. 7(12), pages 1-8, December.
    14. Hongwei Tan & Sebastiaan van Dijken, 2023. "Dynamic machine vision with retinomorphic photomemristor-reservoir computing," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    15. Robert W. Faff & Sebastian Kernbach, 2021. "A visualisation approach for pitching research," Accounting and Finance, Accounting and Finance Association of Australia and New Zealand, vol. 61(4), pages 5177-5197, December.
    16. Tullo, Domenico & Faubert, Jocelyn & Bertone, Armando, 2018. "The characterization of attention resource capacity and its relationship with fluid reasoning intelligence: A multiple object tracking study," Intelligence, Elsevier, vol. 69(C), pages 158-168.
    17. Jifan Zhou & Jun Yin & Tong Chen & Xiaowei Ding & Zaifeng Gao & Mowei Shen, 2011. "Visual Working Memory Capacity Does Not Modulate the Feature-Based Information Filtering in Visual Working Memory," PLOS ONE, Public Library of Science, vol. 6(9), pages 1-10, September.
    18. Nathaniel J. S. Ashby & Stephan Dickert & Andreas Glockner, 2012. "Focusing on what you own: Biased information uptake due to ownership," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 7(3), pages 254-267, May.
    19. Lior Fink & Daniele Papismedov, 2023. "On the Same Page? What Users Benefit from a Desktop View on Mobile Devices," Information Systems Research, INFORMS, vol. 34(2), pages 423-441, June.
    20. Li, Qian & Huang, Zhuowei (Joy) & Christianson, Kiel, 2016. "Visual attention toward tourism photographs with text: An eye-tracking study," Tourism Management, Elsevier, vol. 54(C), pages 243-258.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1006563. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ploscompbiol (email available below). General contact details of provider: https://journals.plos.org/ploscompbiol/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.