Printed from https://ideas.repec.org/a/inm/ormsom/v23y2021i3p637-656.html

The Role of Problem Specification in Crowdsourcing Contests for Design Problems: A Theoretical and Empirical Analysis

Authors
  • Zhaohui (Zoey) Jiang

    (Stephen M. Ross School of Business, University of Michigan, Ann Arbor, Michigan 48109)

  • Yan Huang

    (Tepper School of Business, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213)

  • Damian R. Beil

    (Stephen M. Ross School of Business, University of Michigan, Ann Arbor, Michigan 48109)

Abstract

Problem definition: This paper studies the role of seekers’ problem specification in crowdsourcing contests for design problems.

Academic/practical relevance: Platforms hosting design contests offer detailed guidance for seekers to specify their problems when launching a contest. Yet problem specification in such crowdsourcing contests is something the theoretical and empirical literature has largely overlooked. We aim to fill this gap by offering an empirically validated model to generate insights for the provision of information at contest launch.

Methodology: We develop a game-theoretic model featuring different types of information (categorized as “conceptual objectives” or “execution guidelines”) in problem specifications and assess their impact on design processes and submission qualities. Real-world data are used to empirically test hypotheses and policy recommendations generated from the model, and a quasi-natural experiment provides further empirical validation.

Results: We show theoretically and verify empirically that with more conceptual objectives disclosed in the problem specification, the number of participants in a contest eventually decreases; with more execution guidelines in the problem specification, the trial effort provision by each participant increases; and the best solution quality always increases with more execution guidelines but eventually decreases with more conceptual objectives.

Managerial implications: To maximize the best solution quality in crowdsourced design problems, seekers should always provide more execution guidelines and only a moderate number of conceptual objectives.

Suggested Citation

  • Zhaohui (Zoey) Jiang & Yan Huang & Damian R. Beil, 2021. "The Role of Problem Specification in Crowdsourcing Contests for Design Problems: A Theoretical and Empirical Analysis," Manufacturing & Service Operations Management, INFORMS, vol. 23(3), pages 637-656, May.
  • Handle: RePEc:inm:ormsom:v:23:y:2021:i:3:p:637-656
    DOI: 10.1287/msom.2020.0873

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/msom.2020.0873
    Download Restriction: no

    File URL: https://libkey.io/10.1287/msom.2020.0873?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Sanjiv Erat & Vish Krishnan, 2012. "Managing Delegated Search Over Design Spaces," Management Science, INFORMS, vol. 58(3), pages 606-623, March.
    2. Jürgen Mihm & Jochen Schlapp, 2019. "Sourcing Innovation: On Feedback in Contests," Management Science, INFORMS, vol. 65(2), pages 559-576, February.
    3. Christian Terwiesch & Yi Xu, 2008. "Innovation Contests, Open Innovation, and Multiagent Problem Solving," Management Science, INFORMS, vol. 54(9), pages 1529-1543, September.
    4. Joel O. Wooten & Karl T. Ulrich, 2017. "Idea Generation and the Role of Feedback: Evidence from Field Experiments with Innovation Tournaments," Production and Operations Management, Production and Operations Management Society, vol. 26(1), pages 80-99, January.
    5. Kostas Bimpikis & Shayan Ehsani & Mohamed Mostagir, 2019. "Designing Dynamic Contests," Operations Research, INFORMS, vol. 67(2), pages 339-356, March.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    2. Ying-Ju Chen & Tinglong Dai & C. Gizem Korpeoglu & Ersin Körpeoğlu & Ozge Sahin & Christopher S. Tang & Shihong Xiao, 2020. "OM Forum—Innovative Online Platforms: Research Opportunities," Manufacturing & Service Operations Management, INFORMS, vol. 22(3), pages 430-445, May.
    3. C. Gizem Korpeoglu & Ersin Körpeoğlu & Sıdıka Tunç, 2021. "Optimal Duration of Innovation Contests," Manufacturing & Service Operations Management, INFORMS, vol. 23(3), pages 657-675, May.
    4. Jürgen Mihm & Jochen Schlapp, 2019. "Sourcing Innovation: On Feedback in Contests," Management Science, INFORMS, vol. 65(2), pages 559-576, February.
    5. Joel O. Wooten, 2022. "Leaps in innovation and the Bannister effect in contests," Production and Operations Management, Production and Operations Management Society, vol. 31(6), pages 2646-2663, June.
    6. Zhaohui (Zoey) Jiang & Yan Huang & Damian R. Beil, 2022. "The Role of Feedback in Dynamic Crowdsourcing Contests: A Structural Empirical Analysis," Management Science, INFORMS, vol. 68(7), pages 4858-4877, July.
    7. Laurence Ales & Soo‐Haeng Cho & Ersin Körpeoğlu, 2021. "Innovation Tournaments with Multiple Contributors," Production and Operations Management, Production and Operations Management Society, vol. 30(6), pages 1772-1784, June.
    8. Stylianos Kavadias & Karl T. Ulrich, 2020. "Innovation and New Product Development: Reflections and Insights from the Research Published in the First 20 Years of Manufacturing & Service Operations Management," Manufacturing & Service Operations Management, INFORMS, vol. 22(1), pages 84-92, January.
    9. Cheng, Xi & Gou, Qinglong & Yue, Jinfeng & Zhang, Yan, 2019. "Equilibrium decisions for an innovation crowdsourcing platform," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 125(C), pages 241-260.
    10. Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.
    11. Pallab Sanyal & Shun Ye, 2024. "An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type," Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
    12. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    13. Lakshminarayana Nittala & Sanjiv Erat & Vish Krishnan, 2022. "Designing internal innovation contests," Production and Operations Management, Production and Operations Management Society, vol. 31(5), pages 1963-1976, May.
    14. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    15. Salgado, Stéphane & Hemonnet-Goujot, Aurelie & Henard, David H. & de Barnier, Virginie, 2020. "The dynamics of innovation contest experience: An integrated framework from the customer’s perspective," Journal of Business Research, Elsevier, vol. 117(C), pages 29-43.
    16. Daniel P. Gross, 2020. "Creativity Under Fire: The Effects of Competition on Creative Production," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 583-599, July.
    17. Pin Gao & Xiaoshuai Fan & Yangguang Huang & Ying-Ju Chen, 2022. "Resource Allocation Among Competing Innovators," Management Science, INFORMS, vol. 68(8), pages 6059-6074, August.
    18. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    19. Bavly, Gilad & Heller, Yuval & Schreiber, Amnon, 2022. "Social welfare in search games with asymmetric information," Journal of Economic Theory, Elsevier, vol. 202(C).
    20. Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.