Printed from https://ideas.repec.org/a/plo/pone00/0085508.html

Crowdsourcing Awareness: Exploration of the Ovarian Cancer Knowledge Gap through Amazon Mechanical Turk

Author

Listed:
  • Rebecca R Carter
  • Analisa DiFeo
  • Kath Bogie
  • Guo-Qiang Zhang
  • Jiayang Sun

Abstract

Background: Ovarian cancer is the most lethal gynecologic disease in the United States, with more women dying from this cancer than from all other gynecological cancers combined. Ovarian cancer has been termed the “silent killer” because some patients do not show clear symptoms at an early stage. Currently, there is a lack of approved and effective early diagnostic tools for ovarian cancer. There also appears to be a severe knowledge gap about ovarian cancer in general, and about its indicative symptoms, among both the public and many health professionals. These factors have contributed significantly to the late-stage diagnosis of most ovarian cancer patients (63% are diagnosed at Stage III or above), at which point the 5-year survival rate is less than 30%. The extent of public knowledge about ovarian cancer in the United States is unknown.

Methods: The present investigation examined current public awareness and knowledge of ovarian cancer. The study implemented design strategies to develop an unbiased survey with quality control measures, together with the application of multiple modern statistical analyses. The survey assessed a reasonable proxy of the US population by crowdsourcing participants through the online task marketplace Amazon Mechanical Turk, at a fraction of the cost and time of traditional recruitment methods.

Conclusion: Knowledge of ovarian cancer was compared to that of breast cancer using repeated measures, bias control, and other quality control measures in the survey design. Analyses included multinomial logistic regression and categorical data analysis procedures such as correspondence analysis, among other statistics. We confirmed the relatively poor public knowledge of ovarian cancer among the US population. This simple yet novel design should serve as an example for obtaining quality data via Amazon Mechanical Turk, together with the associated analyses.
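The correspondence analysis mentioned in the abstract can be sketched in a few lines of NumPy: a singular value decomposition of the standardized residuals of a contingency table yields the principal coordinates and the total inertia (the table's chi-square statistic divided by the sample size). The counts below are purely hypothetical, invented for illustration; they are not the paper's data.

```python
import numpy as np

# Hypothetical contingency table: rows are cancer type, columns are
# self-rated knowledge (low / medium / high). Counts are illustrative only.
N = np.array([[120,  90,  30],   # ovarian
              [ 40, 100, 100]])  # breast

P = N / N.sum()                  # correspondence matrix (relative frequencies)
r = P.sum(axis=1)                # row masses
c = P.sum(axis=0)                # column masses

# Standardized residuals: (P - expected) / sqrt(expected)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

inertia = (sv ** 2).sum()                      # equals chi-square / n
row_coords = (U * sv) / np.sqrt(r)[:, None]    # principal row coordinates
```

On the first (and here only non-trivial) dimension, the two row categories receive coordinates of opposite sign, reflecting their contrasting knowledge profiles; the inertia summarizes the overall strength of the row-column association.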

Suggested Citation

  • Rebecca R Carter & Analisa DiFeo & Kath Bogie & Guo-Qiang Zhang & Jiayang Sun, 2014. "Crowdsourcing Awareness: Exploration of the Ovarian Cancer Knowledge Gap through Amazon Mechanical Turk," PLOS ONE, Public Library of Science, vol. 9(1), pages 1-10, January.
  • Handle: RePEc:plo:pone00:0085508
    DOI: 10.1371/journal.pone.0085508

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0085508
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0085508&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0085508?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Jacobsen, Grant D. & Jacobsen, Kathryn H., 2011. "Health awareness campaigns and diagnosis rates: Evidence from National Breast Cancer Awareness Month," Journal of Health Economics, Elsevier, vol. 30(1), pages 55-61, January.
    3. Siddharth Suri & Duncan J Watts, 2011. "Cooperation and Contagion in Web-Based, Networked Public Goods Experiments," PLOS ONE, Public Library of Science, vol. 6(3), pages 1-18, March.
    4. Wu, Stephen, 2003. "Sickness and preventive medical behavior," Journal of Health Economics, Elsevier, vol. 22(4), pages 675-689, July.
    5. Paolacci, Gabriele & Chandler, Jesse & Ipeirotis, Panagiotis G., 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419, August.
    6. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    2. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
3. Blaine G. Robbins, 2017. "Status, identity, and ability in the formation of trust," Rationality and Society, vol. 29(4), pages 408-448, November.
    4. Hoeft, Leonard & Kurschilgen, Michael & Mill, Wladislaw, 2025. "Norms as obligations," International Review of Law and Economics, Elsevier, vol. 81(C).
    5. Yamada, Katsunori & Sato, Masayuki, 2013. "Another avenue for anatomy of income comparisons: Evidence from hypothetical choice experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 89(C), pages 35-57.
    6. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    7. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    8. Atalay, Kadir & Bakhtiar, Fayzan & Cheung, Stephen & Slonim, Robert, 2014. "Savings and prize-linked savings accounts," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 86-106.
    9. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    10. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    11. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    12. Valerio Capraro & Hélène Barcelo, 2021. "Punishing defectors and rewarding cooperators: Do people discriminate between genders?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(1), pages 19-32, September.
    13. Garbarino, Ellen & Slonim, Robert & Villeval, Marie Claire, 2019. "Loss aversion and lying behavior," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 379-393.
    14. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    15. Yulia Evsyukova & Felix Rusche & Wladislaw Mill, 2023. "LinkedOut? A Field Experiment on Discrimination in Job Network Formation," CRC TR 224 Discussion Paper Series crctr224_2023_482, University of Bonn and University of Mannheim, Germany.
    16. Ola Andersson & Jim Ingebretsen Carlson & Erik Wengström, 2021. "Differences Attract: An Experimental Study of Focusing in Economic Choice," The Economic Journal, Royal Economic Society, vol. 131(639), pages 2671-2692.
    17. Tim Straub & Henner Gimpel & Florian Teschner & Christof Weinhardt, 2015. "How (not) to Incent Crowd Workers," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 57(3), pages 167-179, June.
    18. Chandler, Dana & Kapelner, Adam, 2013. "Breaking monotony with meaning: Motivation in crowdsourcing markets," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 123-133.
    19. Milena Tsvetkova & Michael W Macy, 2014. "The Social Contagion of Generosity," PLOS ONE, Public Library of Science, vol. 9(2), pages 1-9, February.
    20. Harman, Elena & Azzam, Tarek, 2018. "Incorporating public values into evaluative criteria: Using crowdsourcing to identify criteria and standards," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 68-82.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0085508. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.