
Lessons Learned from Crowdsourcing Complex Engineering Tasks

Author

Listed:
  • Matthew Staffelbach
  • Peter Sempolinski
  • Tracy Kijewski-Correa
  • Douglas Thain
  • Daniel Wei
  • Ahsan Kareem
  • Gregory Madey

Abstract

Crowdsourcing: Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and tagging images. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: the analysis and creation of wind simulations.

Harnessing Crowdworkers for Engineering: Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks, to determine whether its benefits could be harnessed to contribute accurately and effectively to solving real-world engineering problems. Untrained crowds cannot, of course, serve as a mere substitute for trained expertise; rather, we sought to understand how crowd workers can be used as a large pool of labor for a preliminary analysis of complex data.

Virtual Wind Tunnel: We compared the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students in making a first pass at analyzing wind simulation data. In the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. In the second phase, we instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task.

Conclusions: With a sufficiently comprehensive tutorial and compensation similar to typical crowdsourcing wages, we were able to enlist crowd workers to complete longer, more complex tasks effectively, with competence comparable to that of graduate students who have more comprehensive, expert-level knowledge. More complex tasks, however, require increased communication with the workers; as tasks become more complex, the employment relationship grows more akin to outsourcing than to crowdsourcing. Through this investigation, we were able to stretch and explore the limits of crowdsourcing as a tool for solving complex problems.
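
The paper does not reproduce the requesters' task-posting code, but the workflow described above, posting an analysis question as a paid microtask, can be illustrated with a short sketch. The sketch below, assuming the standard boto3 Mechanical Turk client, shows how such a question might be published as a HIT; the question text, reward, worker count, and deadlines are illustrative assumptions, not values from the study.

    # Illustrative sketch only: publishing a wind-analysis question to Amazon
    # Mechanical Turk via the standard boto3 client. Question text, reward,
    # and worker counts are hypothetical, not taken from the paper.
    import boto3

    # The sandbox endpoint lets a requester test HITs without paying workers;
    # drop endpoint_url to post to the production marketplace.
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # A minimal QuestionForm asking workers to interpret simulation output.
    question_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>wind_analysis_1</QuestionIdentifier>
        <QuestionContent>
          <Text>Which of the two building cross-sections shows the lower peak wind pressure?</Text>
        </QuestionContent>
        <AnswerSpecification>
          <FreeTextAnswer/>
        </AnswerSpecification>
      </Question>
    </QuestionForm>"""

    hit = mturk.create_hit(
        Title="Analyze wind simulation results (tutorial provided)",
        Description="Answer a short question about wind-pressure data from a simulation.",
        Keywords="survey, analysis, engineering, simulation",
        Reward="0.50",                     # USD, passed as a string; illustrative wage
        MaxAssignments=30,                 # number of distinct workers requested
        LifetimeInSeconds=7 * 24 * 3600,   # how long the HIT remains listed
        AssignmentDurationInSeconds=3600,  # time allotted to each worker
        Question=question_xml,
    )
    print("HIT created:", hit["HIT"]["HITId"])

Submitted answers would then be retrieved with list_assignments_for_hit and compared against the graduate students' responses, mirroring the first phase of the experiment.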

Suggested Citation

  • Matthew Staffelbach & Peter Sempolinski & Tracy Kijewski-Correa & Douglas Thain & Daniel Wei & Ahsan Kareem & Gregory Madey, 2015. "Lessons Learned from Crowdsourcing Complex Engineering Tasks," PLOS ONE, Public Library of Science, vol. 10(9), pages 1-19, September.
  • Handle: RePEc:plo:pone00:0134978
    DOI: 10.1371/journal.pone.0134978

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0134978
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0134978&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0134978?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Andrei P. Kirilenko & Travis Desell & Hany Kim & Svetlana Stepchenkova, 2017. "Crowdsourcing Analysis of Twitter Data on Climate Change: Paid Workers vs. Volunteers," Sustainability, MDPI, vol. 9(11), pages 1-15, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
    2. Yamada, Katsunori & Sato, Masayuki, 2013. "Another avenue for anatomy of income comparisons: Evidence from hypothetical choice experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 89(C), pages 35-57.
    3. Giorgia Ponsi & Maria Serena Panasiti & Salvatore Maria Aglioti & Marco Tullio Liuzza, 2017. "Right-wing authoritarianism and stereotype-driven expectations interact in shaping intergroup trust in one-shot vs multiple-round social interactions," PLOS ONE, Public Library of Science, vol. 12(12), pages 1-23, December.
    4. Valerio Capraro & David G. Rand, 2018. "Do the Right Thing: Experimental evidence that preferences for moral behavior, rather than equity or efficiency per se, drive human prosociality," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 13(1), pages 99-111, January.
    5. Ola Andersson & Jim Ingebretsen Carlson & Erik Wengström, 2021. "Differences Attract: An Experimental Study of Focusing in Economic Choice," The Economic Journal, Royal Economic Society, vol. 131(639), pages 2671-2692.
    6. Kate Farrow & Gilles Grolleau & Lisette Ibanez, 2017. "Designing more effective norm interventions: the role of valence," Post-Print hal-01680539, HAL.
    7. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    8. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "Lab-like Findings of Non-Lab Experiments: a Methodological Proposal and Validation," Working Papers CESARE 3/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    9. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    10. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "The Show Must Go On: How to Elicit Lablike Data on the Effects of COVID-19 Lockdown on Fairness and Cooperation," Working Papers CESARE 2/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    11. Zhongming Lu & John Crittenden & Frank Southworth & Ellen Dunham-Jones, 2017. "An integrated framework for managing the complex interdependence between infrastructures and the socioeconomic environment: An application in metropolitan Atlanta," Urban Studies, Urban Studies Journal Limited, vol. 54(12), pages 2874-2893, September.
    12. Johannes G. Jaspersen & Marc A. Ragin & Justin R. Sydnor, 2022. "Insurance demand experiments: Comparing crowdworking to the lab," Journal of Risk & Insurance, The American Risk and Insurance Association, vol. 89(4), pages 1077-1107, December.
    13. Van Borm, Hannah & Burn, Ian & Baert, Stijn, 2021. "What Does a Job Candidate's Age Signal to Employers?," Labour Economics, Elsevier, vol. 71(C).
    14. Lea Skræp Svenningsen, 2017. "Distributive outcomes matter: Measuring social preferences for climate policy," IFRO Working Paper 2017/11, University of Copenhagen, Department of Food and Resource Economics.
    15. Jillian J Jordan & David G Rand & Samuel Arbesman & James H Fowler & Nicholas A Christakis, 2013. "Contagion of Cooperation in Static and Fluid Social Networks," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-10, June.
    16. Eriksen, Kristoffer W. & Fest, Sebastian & Kvaløy, Ola & Dijk, Oege, 2022. "Fair advice," Journal of Banking & Finance, Elsevier, vol. 143(C).
    17. Manapat, Michael L. & Nowak, Martin A. & Rand, David G., 2013. "Information, irrationality, and the evolution of trust," Journal of Economic Behavior & Organization, Elsevier, vol. 90(S), pages 57-75.
    18. Hyndman, Kyle & Walker, Matthew J., 2022. "Fairness and risk in ultimatum bargaining," Games and Economic Behavior, Elsevier, vol. 132(C), pages 90-105.
    19. Horváth, Gergely, 2023. "Peer effects through receiving advice in job search: An experimental study," Journal of Economic Behavior & Organization, Elsevier, vol. 216(C), pages 494-519.
    20. Jillian Jordan & Katherine McAuliffe & David Rand, 2016. "The effects of endowment size and strategy method on third party punishment," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 741-763, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0134978. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.