Printed from https://ideas.repec.org/a/inm/orisre/v33y2022i1p265-284.html

Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests

Author

Listed:
  • Tat Koon Koh

    (School of Business and Management, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong)

  • Muller Y. M. Cheung

    (School of Business and Management, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong)

Abstract

Idea seekers in crowdsourcing ideation contests often provide solution exemplars to guide solvers in developing ideas. Solvers can also use these exemplars to infer seekers’ preferences when generating ideas. In this study, we delve into solvers’ ideation process and examine how seeker exemplars affect the quantitative outcomes in solvers’ scanning, shortlisting, and selection of ideas; these ideation activities relate to the search and evaluate stage of a previously published knowledge reuse for innovation model. We theorize that solvers’ use of local (problem-related) and/or distant (problem-unrelated) seeker exemplars in the respective search and evaluation activities is affected by their belief and emphasis in contests as well as the influences of processing fluency and confirmation bias during idea generation. Consequently, local and distant seeker exemplars have different effects in different ideation activities. Consistent with our theorizing, the results from an ideation contest experiment show that, compared with not showing any seeker exemplars, providing these exemplars either does not affect or could even hurt the quantitative outcomes in the respective ideation activities. We find that solvers generally search for, shortlist, and/or submit fewer ideas when shown certain seeker exemplars. Moreover, solvers who submit fewer ideas tend to submit lower quality ideas on average. Thus, showing seeker exemplars, which contest platforms encourage and seekers often do, could negatively affect quantitative ideation outcomes and thereby impair idea quality. We discuss the theoretical and practical implications of this research.

Suggested Citation

  • Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.
  • Handle: RePEc:inm:orisre:v:33:y:2022:i:1:p:265-284
    DOI: 10.1287/isre.2021.1054

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/isre.2021.1054
    Download Restriction: no

    File URL: https://libkey.io/10.1287/isre.2021.1054?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Dan Li & Longying Hu, 2017. "Exploring the effects of reward and competition intensity on participation in crowdsourcing contests," Electronic Markets, Springer;IIM University of St. Gallen, vol. 27(3), pages 199-210, August.
    3. Gabriele Paolacci & Jesse Chandler & Panagiotis G. Ipeirotis, 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419.
    4. Juncai Jiang & Yu Wang, 2020. "A Theoretical and Empirical Investigation of Feedback in Ideation Contests," Production and Operations Management, Production and Operations Management Society, vol. 29(2), pages 481-500, February.
    5. C. Page Moreau & Darren W. Dahl, 2005. "Designing the Solution: The Impact of Constraints on Consumers' Creativity," Journal of Consumer Research, Journal of Consumer Research Inc., vol. 32(1), pages 13-22, June.
    6. Daniel Kahneman, 2003. "Maps of Bounded Rationality: Psychology for Behavioral Economics," American Economic Review, American Economic Association, vol. 93(5), pages 1449-1475, December.
    7. Zhaohui (Zoey) Jiang & Yan Huang & Damian R. Beil, 2021. "The Role of Problem Specification in Crowdsourcing Contests for Design Problems: A Theoretical and Empirical Analysis," Manufacturing & Service Operations Management, INFORMS, vol. 23(3), pages 637-656, May.
    8. Joel O. Wooten & Karl T. Ulrich, 2017. "Idea Generation and the Role of Feedback: Evidence from Field Experiments with Innovation Tournaments," Production and Operations Management, Production and Operations Management Society, vol. 26(1), pages 80-99, January.
    9. Elina H. Hwang & Param Vir Singh & Linda Argote, 2019. "Jack of All, Master of Some: Information Network and Innovation in Crowdsourcing Communities," Information Systems Research, INFORMS, vol. 30(2), pages 389-410, June.
    10. Barry L. Bayus, 2013. "Crowdsourcing New Product Ideas over Time: An Analysis of the Dell IdeaStorm Community," Management Science, INFORMS, vol. 59(1), pages 226-244, June.
    11. Karan Girotra & Christian Terwiesch & Karl T. Ulrich, 2010. "Idea Generation and the Quality of the Best Idea," Management Science, INFORMS, vol. 56(4), pages 591-605, April.
    12. Nikolaus Franke & Marion K. Poetz & Martin Schreier, 2014. "Integrating Problem Solvers from Analogous Markets in New Product Ideation," Management Science, INFORMS, vol. 60(4), pages 1063-1081, April.
    13. Jürgen Mihm & Jochen Schlapp, 2019. "Sourcing Innovation: On Feedback in Contests," Management Science, INFORMS, vol. 65(2), pages 559-576, February.
    14. Christian Terwiesch & Yi Xu, 2008. "Innovation Contests, Open Innovation, and Multiagent Problem Solving," Management Science, INFORMS, vol. 54(9), pages 1529-1543, September.
    15. Ivo Blohm & Christoph Riedl & Johann Füller & Jan Marco Leimeister, 2016. "Rate or Trade? Identifying Winning Ideas in Open Idea Sourcing," Information Systems Research, INFORMS, vol. 27(1), pages 27-48, March.
    16. Tat Koon Koh, 2019. "Adopting Seekers’ Solution Exemplars in Crowdsourcing Ideation Contests: Antecedents and Consequences," Information Systems Research, INFORMS, vol. 30(2), pages 486-506, June.
    17. Lars Bo Jeppesen & Karim R. Lakhani, 2010. "Marginality and Problem-Solving Effectiveness in Broadcast Search," Organization Science, INFORMS, vol. 21(5), pages 1016-1033, October.
    18. Christoph Riedl & Victor P. Seidel, 2018. "Learning from Mixed Signals in Online Innovation Communities," Organization Science, INFORMS, vol. 29(6), pages 1010-1032, December.
    19. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    20. Ann Majchrzak & Lynne P. Cooper & Olivia E. Neece, 2004. "Knowledge Reuse for Innovation," Management Science, INFORMS, vol. 50(2), pages 174-188, February.
    21. Schemmann, Brita & Herrmann, Andrea M. & Chappin, Maryse M.H. & Heimeriks, Gaston J., 2016. "Crowdsourcing ideas: Involving ordinary users in the ideation phase of new product development," Research Policy, Elsevier, vol. 45(6), pages 1145-1154.
    22. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    23. Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project, subscribe to its RSS feed for this item.


    Cited by:

    1. Bonazzi, Riccardo & Viscusi, Gianluigi & Solidoro, Adriano, 2024. "Crowd mining as a strategic resource for innovation seekers," Technovation, Elsevier, vol. 132(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pallab Sanyal & Shun Ye, 2024. "An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type," Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
    2. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    3. Dahlander, Linus & Beretta, Michela & Thomas, Arne & Kazemi, Shahab & Fenger, Morten H.J. & Frederiksen, Lars, 2023. "Weeding out or picking winners in open innovation? Factors driving multi-stage crowd selection on LEGO ideas," Research Policy, Elsevier, vol. 52(10).
    4. Patel, Chirag & Ahmad Husairi, Mariyani & Haon, Christophe & Oberoi, Poonam, 2023. "Monetary rewards and self-selection in design crowdsourcing contests: Managing participation, contribution appropriateness, and winning trade-offs," Technological Forecasting and Social Change, Elsevier, vol. 191(C).
    5. Jiao, Yuanyuan & Wu, Yepeng & Lu, Steven, 2021. "The role of crowdsourcing in product design: The moderating effect of user expertise and network connectivity," Technology in Society, Elsevier, vol. 64(C).
    6. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    7. Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.
    8. Yang, Xi & Zhao, Quanwu & Sun, Heshan, 2022. "Seekers’ complaint behavior in crowdsourcing: An uncertainty perspective," Journal of Retailing and Consumer Services, Elsevier, vol. 68(C).
    9. Gillier, Thomas & Chaffois, Cédric & Belkhouja, Mustapha & Roth, Yannig & Bayus, Barry L., 2018. "The effects of task instructions in crowdsourcing innovative ideas," Technological Forecasting and Social Change, Elsevier, vol. 134(C), pages 35-44.
    10. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    11. Steils, Nadia & Hanine, Salwa, 2019. "Recruiting valuable participants in online IDEA generation: The role of brief instructions," Journal of Business Research, Elsevier, vol. 96(C), pages 14-25.
    12. Yang, Mu & Han, Chunjia, 2021. "Stimulating innovation: Managing peer interaction for idea generation on digital innovation platforms," Journal of Business Research, Elsevier, vol. 125(C), pages 456-465.
    13. Tat Koon Koh, 2019. "Adopting Seekers’ Solution Exemplars in Crowdsourcing Ideation Contests: Antecedents and Consequences," Information Systems Research, INFORMS, vol. 30(2), pages 486-506, June.
    14. Moghaddam, Ehsan Noorzad & Aliahmadi, Alireza & Bagherzadeh, Mehdi & Markovic, Stefan & Micevski, Milena & Saghafi, Fatemeh, 2023. "Let me choose what I want: The influence of incentive choice flexibility on the quality of crowdsourcing solutions to innovation problems," Technovation, Elsevier, vol. 120(C).
    15. Joel O. Wooten, 2022. "Leaps in innovation and the Bannister effect in contests," Production and Operations Management, Production and Operations Management Society, vol. 31(6), pages 2646-2663, June.
    16. Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
    17. Lakshminarayana Nittala & Sanjiv Erat & Vish Krishnan, 2022. "Designing internal innovation contests," Production and Operations Management, Production and Operations Management Society, vol. 31(5), pages 1963-1976, May.
    18. Niek Althuizen & Bo Chen, 2022. "Crowdsourcing Ideas Using Product Prototypes: The Joint Effect of Prototype Enhancement and the Product Design Goal on Idea Novelty," Management Science, INFORMS, vol. 68(4), pages 3008-3025, April.
    19. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    20. Yan Huang & Param Vir Singh & Kannan Srinivasan, 2014. "Crowdsourcing New Product Ideas Under Consumer Learning," Management Science, INFORMS, vol. 60(9), pages 2138-2159, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orisre:v:33:y:2022:i:1:p:265-284. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations awaiting confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.