
An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type

Authors

Listed:
  • Pallab Sanyal

    (School of Business, George Mason University, Fairfax, Virginia 22030)

  • Shun Ye

    (School of Business, George Mason University, Fairfax, Virginia 22030)

Abstract

As more businesses turn to crowdsourcing platforms for solutions to business problems, determining how to manage sourcing contests based on their objectives has become critically important. Existing research, both theoretical and empirical, studies the impact of a variety of contest and contestant characteristics on the outcomes of these contests. Aside from these static design parameters, a lever organizations (clients) can use to dynamically steer contests toward desirable goals is the feedback offered to the contestants (solvers) during the contest. Although a handful of recent studies focus on the effects of feedback at a high level (e.g., volume, valence), to the best of our knowledge, none has examined the effects of the information contained in the feedback. Furthermore, the existing studies focus solely on the quality of the submissions and not on other critical contest outcomes, such as the diversity of the submissions, which the creativity and innovation literature finds to be significant. In this study, first, drawing on the psychology literature on feedback intervention theory, we classify client feedback into two types: outcome and process. Second, using data from almost 12,000 design contests, we empirically examine the effects of the two types of feedback on the convergence and diversity of submissions following feedback interventions. We find that process feedback, which provides goal-oriented information to solvers, fosters convergent thinking, leading to submissions that are similar to one another. Although outcome feedback lacks the informative value of process feedback, it encourages divergent thinking, the ability to produce a variety of solutions to a problem. Furthermore, we find that these effects are strengthened when the feedback is provided earlier in the contest rather than later. Based on our findings, we offer insights on how practitioners can strategically use the appropriate form of feedback to either generate greater diversity of solutions or achieve efficient convergence to an acceptable solution.
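
The abstract does not specify how the convergence and diversity of submissions are operationalized. Purely as an illustration, one common proxy is the average pairwise similarity of submission feature vectors within a contest, compared before and after a feedback event. The short Python sketch below shows that idea; the data, variable names, and the cosine-similarity measure are illustrative assumptions, not the authors' actual method.

    # Illustrative sketch only: quantify convergence/diversity of a set of contest
    # submissions as the mean pairwise cosine similarity of their feature vectors.
    # All data and names below are hypothetical, not taken from the paper.
    import numpy as np

    def mean_pairwise_similarity(vectors):
        """Average cosine similarity over all distinct pairs of rows (submissions)."""
        norms = np.linalg.norm(vectors, axis=1, keepdims=True)
        unit = vectors / np.clip(norms, 1e-12, None)   # normalize each submission vector
        sim = unit @ unit.T                            # cosine-similarity matrix
        iu = np.triu_indices(len(vectors), k=1)        # indices of distinct pairs
        return float(sim[iu].mean())

    rng = np.random.default_rng(0)
    before = rng.normal(size=(4, 16))                  # 4 hypothetical pre-feedback submissions
    after = before.mean(axis=0) + 0.2 * rng.normal(size=(4, 16))  # post-feedback, made more alike

    print("mean similarity before feedback:", round(mean_pairwise_similarity(before), 3))
    print("mean similarity after feedback: ", round(mean_pairwise_similarity(after), 3))
    # A rise in mean similarity after feedback would indicate convergence; a drop, greater diversity.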

Suggested Citation

  • Pallab Sanyal & Shun Ye, 2024. "An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type," Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
  • Handle: RePEc:inm:orisre:v:35:y:2024:i:1:p:394-413
    DOI: 10.1287/isre.2023.1232

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/isre.2023.1232
    Download Restriction: no

    File URL: https://libkey.io/10.1287/isre.2023.1232?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.
    2. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    3. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    4. Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
    5. Jiao, Yuanyuan & Wu, Yepeng & Lu, Steven, 2021. "The role of crowdsourcing in product design: The moderating effect of user expertise and network connectivity," Technology in Society, Elsevier, vol. 64(C).
    6. Lakshminarayana Nittala & Sanjiv Erat & Vish Krishnan, 2022. "Designing internal innovation contests," Production and Operations Management, Production and Operations Management Society, vol. 31(5), pages 1963-1976, May.
    7. Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.
    8. Hu, Feng & Bijmolt, Tammo H.A. & Huizingh, Eelko K.R.E., 2020. "The impact of innovation contest briefs on the quality of solvers and solutions," Technovation, Elsevier, vol. 90.
    9. Cheng, Xi & Gou, Qinglong & Yue, Jinfeng & Zhang, Yan, 2019. "Equilibrium decisions for an innovation crowdsourcing platform," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 125(C), pages 241-260.
    10. Dargahi, Rambod & Namin, Aidin & Ketron, Seth C. & Saint Clair, Julian K., 2021. "Is self-knowledge the ultimate prize? A quantitative analysis of participation choice in online ideation crowdsourcing contests," Journal of Retailing and Consumer Services, Elsevier, vol. 62(C).
    11. Patel, Chirag & Ahmad Husairi, Mariyani & Haon, Christophe & Oberoi, Poonam, 2023. "Monetary rewards and self-selection in design crowdsourcing contests: Managing participation, contribution appropriateness, and winning trade-offs," Technological Forecasting and Social Change, Elsevier, vol. 191(C).
    12. Zhuojun Gu & Ravi Bapna & Jason Chan & Alok Gupta, 2022. "Measuring the Impact of Crowdsourcing Features on Mobile App User Engagement and Retention: A Randomized Field Experiment," Management Science, INFORMS, vol. 68(2), pages 1297-1329, February.
    13. Joel O. Wooten, 2022. "Leaps in innovation and the Bannister effect in contests," Production and Operations Management, Production and Operations Management Society, vol. 31(6), pages 2646-2663, June.
    14. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    15. Juncai Jiang & Yu Wang, 2020. "A Theoretical and Empirical Investigation of Feedback in Ideation Contests," Production and Operations Management, Production and Operations Management Society, vol. 29(2), pages 481-500, February.
    16. Salgado, Stéphane & Hemonnet-Goujot, Aurelie & Henard, David H. & de Barnier, Virginie, 2020. "The dynamics of innovation contest experience: An integrated framework from the customer’s perspective," Journal of Business Research, Elsevier, vol. 117(C), pages 29-43.
    17. Niek Althuizen & Bo Chen, 2022. "Crowdsourcing Ideas Using Product Prototypes: The Joint Effect of Prototype Enhancement and the Product Design Goal on Idea Novelty," Management Science, INFORMS, vol. 68(4), pages 3008-3025, April.
    18. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    19. Yan Huang & Param Vir Singh & Kannan Srinivasan, 2014. "Crowdsourcing New Product Ideas Under Consumer Learning," Management Science, INFORMS, vol. 60(9), pages 2138-2159, September.
    20. Laura J. Kornish & Jeremy Hutchison‐Krupat, 2017. "Research on Idea Generation and Selection: Implications for Management of Technology," Production and Operations Management, Production and Operations Management Society, vol. 26(4), pages 633-651, April.
