Printed from https://ideas.repec.org/a/bla/jinfst/v72y2021i2p141-155.html

Do better search engines really equate to better clinical decisions? If not, why not?

Author

Listed:
  • Anton van der Vegt
  • Guido Zuccon
  • Bevan Koopman

Abstract

Previous research has found that improved search engine effectiveness—evaluated using a batch‐style approach—does not always translate to significant improvements in user task performance; however, these prior studies focused on simple recall‐ and precision‐based search tasks. We investigated the same relationship, but for the realistic, complex search tasks required in clinical decision making. One hundred and nine clinicians and final‐year medical students answered 16 clinical questions. Although the search engine did improve answer accuracy by 20 percentage points, there was no significant difference when participants used a more effective, state‐of‐the‐art search engine. We also found that the search engine effectiveness difference, identified in the lab, was diminished by around 70% when the search engines were used by real users. Despite the aid of the search engine, half of the clinical questions were answered incorrectly. We further identified the relative contribution of search engine effectiveness to overall end‐task success, and found that the ability to interpret documents correctly was a much more important factor affecting task success. If these findings are representative, information retrieval research may need to reorient its emphasis towards helping users to better understand information, rather than just finding it for them.

Suggested Citation

  • Anton van der Vegt & Guido Zuccon & Bevan Koopman, 2021. "Do better search engines really equate to better clinical decisions? If not, why not?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 72(2), pages 141-155, February.
  • Handle: RePEc:bla:jinfst:v:72:y:2021:i:2:p:141-155
    DOI: 10.1002/asi.24398

    Download full text from publisher

    File URL: https://doi.org/10.1002/asi.24398
    Download Restriction: no

    File URL: https://libkey.io/10.1002/asi.24398?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Tefko Saracevic, 1975. "RELEVANCE: A review of and a framework for the thinking on the notion in information science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 26(6), pages 321-343, November.
    2. William Hersh & Jeffrey Pentecost & David Hickam, 1996. "A task‐oriented approach to information retrieval evaluation," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 47(1), pages 50-56, January.
    3. William Hersh, 1994. "Relevance and retrieval evaluation: Perspectives from medicine," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 45(3), pages 201-206, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Ashraf Labib & Salem Chakhar & Lorraine Hope & John Shimell & Mark Malinowski, 2022. "Analysis of noise and bias errors in intelligence information systems," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(12), pages 1755-1775, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xiaoli Huang & Dagobert Soergel, 2013. "Relevance: An improved framework for explicating the notion," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 18-35, January.
    2. Pertti Vakkari & Michael Völske & Martin Potthast & Matthias Hagen & Benno Stein, 2021. "Predicting essay quality from search and writing behavior," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 72(7), pages 839-852, July.
    3. Gineke Wiggers & Suzan Verberne & Wouter van Loon & Gerrit‐Jan Zwenne, 2023. "Bibliometric‐enhanced legal information retrieval: Combining usage and citations as flavors of impact relevance," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(8), pages 1010-1025, August.
    4. Malcolm Dow & Peter Willett & Roderick McDonald & Belver Griffith & Michael Greenacre & Peter Bryant & Daniel Wartenberg & Ove Frank, 1987. "Book reviews," Journal of Classification, Springer;The Classification Society, vol. 4(2), pages 245-278, September.
    5. Johanna I. Westbrook & A. Sophie Gosling & Enrico W. Coiera, 2005. "The Impact of an Online Evidence System on Confidence in Decision Making in a Controlled Setting," Medical Decision Making, , vol. 25(2), pages 178-185, March.
    6. Antonio Maria Rinaldi & Cristiano Russo & Cristian Tommasino, 2020. "A Knowledge-Driven Multimedia Retrieval System Based on Semantics and Deep Features," Future Internet, MDPI, vol. 12(11), pages 1-20, October.
    7. Jingfei Li & Peng Zhang & Dawei Song & Yue Wu, 2017. "Understanding an enriched multidimensional user relevance model by analyzing query logs," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(12), pages 2743-2754, December.
    8. Howard D. White, 2010. "Some new tests of relevance theory in information science," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(3), pages 653-667, June.
    9. Dietmar Wolfram, 2015. "The symbiotic relationship between information retrieval and informetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2201-2214, March.
    10. Anton Oleinik, 2022. "Relevance in Web search: between content, authority and popularity," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(1), pages 173-194, February.
    11. Aurora González-Teruel & Gregorio González-Alcaide & Maite Barrios & María-Francisca Abad-García, 2015. "Mapping recent information behavior research: an analysis of co-authorship and co-citation networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(2), pages 687-705, May.
    12. Olof Sundin & Dirk Lewandowski & Jutta Haider, 2022. "Whose relevance? Web search engines as multisided relevance machines," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(5), pages 637-642, May.
    13. P. K. Paul & A. Bhuimali & R. Rajesh & K. L. Dangwal & P. Das & J. Ganguly, 2016. "Green Computing for Eco Enriched Information Services and Systems: Environmental & Bio Informatics Perspective," Journal of Biotechnology Research, Academic Research Publishing Group, vol. 2(6), pages 44-48, June.
    14. Adan Ortiz-Cordova & Bernard J. Jansen, 2012. "Classifying web search queries to identify high revenue generating customers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(7), pages 1426-1441, July.
    15. Nathalie Demoulin & Kristof Coussement, 2018. "Acceptance of text-mining systems: The signaling role of information quality," Post-Print hal-02111772, HAL.
    16. Peter Mutschke & Philipp Mayr, 2015. "Science models for search: a study on combining scholarly information retrieval and scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2323-2345, March.
    17. Lutz Bornmann & Loet Leydesdorff, 2020. "Historical roots of Judit Bar-Ilan’s research: a cited-references analysis using CRExplorer," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(3), pages 1193-1200, June.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jinfst:v:72:y:2021:i:2:p:141-155. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.asis.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.