
Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation

Authors

Listed:
  • Jacqueline N. Lane

    (Harvard Business School, Boston, Massachusetts 02163)

  • Misha Teplitskiy

    (Laboratory for Innovation Science at Harvard, Boston, Massachusetts 02134; University of Michigan School of Information, Ann Arbor, Michigan 48109)

  • Gary Gray

    (Harvard Medical School, Boston, Massachusetts 02115)

  • Hardeep Ranu

    (Harvard Medical School, Boston, Massachusetts 02115)

  • Michael Menietti

    (Harvard Business School, Boston, Massachusetts 02163; Laboratory for Innovation Science at Harvard, Boston, Massachusetts 02134)

  • Eva C. Guinan

    (Laboratory for Innovation Science at Harvard, Boston, Massachusetts 02134; Harvard Medical School, Boston, Massachusetts 02115; Dana-Farber Cancer Institute, Boston, Massachusetts 02215)

  • Karim R. Lakhani

    (Harvard Business School, Boston, Massachusetts 02163; Laboratory for Innovation Science at Harvard, Boston, Massachusetts 02134)

Abstract

The evaluation and selection of novel projects lies at the heart of scientific and technological innovation, and yet there are persistent concerns about bias, such as conservatism. This paper investigates the role that the format of evaluation, specifically information sharing among expert evaluators, plays in generating conservative decisions. We executed two field experiments in two separate grant-funding opportunities at a leading research university, mobilizing 369 evaluators from seven universities to evaluate 97 projects, resulting in 761 proposal-evaluation pairs and more than $250,000 in awards. We exogenously varied the relative valence (positive and negative) of others’ scores and measured how exposures to higher and lower scores affected the focal evaluator’s propensity to change their initial score. We found causal evidence of a negativity bias, in which evaluators lower their scores by more points after seeing scores more critical than their own than they raise them after seeing more favorable scores. Qualitative coding of the evaluators’ justifications for score changes reveals that exposures to lower scores were associated with greater attention to uncovering weaknesses, whereas exposures to neutral or higher scores were associated with increased emphasis on nonevaluation criteria, such as confidence in one’s judgment. The greater power of negative information suggests that information sharing among expert evaluators can lead to more conservative allocation decisions that favor protecting against failure rather than maximizing success.
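To make the design concrete, the sketch below shows one way the asymmetry described above could be tested: regress each evaluator’s score revision on indicators for exposure to more critical versus more favorable peer scores. The simulated data, variable names, and simple OLS specification are illustrative assumptions only, not the authors’ actual data or estimation strategy.

    # Illustrative sketch (not the paper's specification): testing for asymmetric
    # score updating ("negativity bias") with simulated proposal-evaluation pairs.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 761  # number of proposal-evaluation pairs reported in the abstract

    # Hypothetical treatment indicators: whether the focal evaluator saw peer
    # scores more critical (lower) or more favorable (higher) than their own.
    saw_lower = rng.integers(0, 2, size=n)
    saw_higher = (1 - saw_lower) * rng.integers(0, 2, size=n)

    # Simulated score revisions with a built-in negativity bias: the downward
    # revision after lower peer scores is larger in magnitude than the upward
    # revision after higher peer scores.
    change = -1.2 * saw_lower + 0.4 * saw_higher + rng.normal(0.0, 1.0, size=n)

    df = pd.DataFrame({"change": change, "saw_lower": saw_lower, "saw_higher": saw_higher})
    fit = smf.ols("change ~ saw_lower + saw_higher", data=df).fit()
    print(fit.params)  # |coef(saw_lower)| > coef(saw_higher) indicates negativity bias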

Suggested Citation

  • Jacqueline N. Lane & Misha Teplitskiy & Gary Gray & Hardeep Ranu & Michael Menietti & Eva C. Guinan & Karim R. Lakhani, 2022. "Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation," Management Science, INFORMS, vol. 68(6), pages 4478-4495, June.
  • Handle: RePEc:inm:ormnsc:v:68:y:2022:i:6:p:4478-4495
    DOI: 10.1287/mnsc.2021.4107

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/mnsc.2021.4107
    Download Restriction: no

    File URL: https://libkey.io/10.1287/mnsc.2021.4107?utm_source=ideas

    References listed on IDEAS

    1. Gilat Levy, 2007. "Decision Making in Committees: Transparency, Reputation, and Voting Rules," American Economic Review, American Economic Association, vol. 97(1), pages 150-168, March.
    2. Thomas Åstebro & Samir Elhedhli, 2006. "The Effectiveness of Simple Decision Heuristics: Forecasting Commercial Success for Early-Stage Ventures," Management Science, INFORMS, vol. 52(3), pages 395-409, March.
    3. Thomas Heinze, 2008. "How to sponsor ground-breaking research: A comparison of funding schemes," Science and Public Policy, Oxford University Press, vol. 35(5), pages 302-318, June.
    4. Boudreau, Kevin J. & Lakhani, Karim R., 2015. "“Open” disclosure of innovations, incentives and follow-on reuse: Theory on processes of cumulative innovation and a field experiment in computational biology," Research Policy, Elsevier, vol. 44(1), pages 4-19.
    5. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    6. Albert E. Mannes, 2009. "Are We Wise About the Wisdom of Crowds? The Use of Group Judgments in Belief Revision," Management Science, INFORMS, vol. 55(8), pages 1267-1279, August.
    7. Virginia Gewin, 2012. "Risky research: The sky's the limit," Nature, Nature, vol. 487(7407), pages 395-397, July.
    8. Felipe A. Csaszar & J. P. Eggers, 2013. "Organizational Decision Making: An Information Aggregation View," Management Science, INFORMS, vol. 59(10), pages 2257-2277, October.
    9. Ernst Fehr & Michael Naef & Klaus M. Schmidt, 2006. "Inequality Aversion, Efficiency, and Maximin Preferences in Simple Distribution Experiments: Comment," American Economic Review, American Economic Association, vol. 96(5), pages 1912-1917, December.
    10. Gerardo A. Okhuysen & Kathleen M. Eisenhardt, 2002. "Integrating Knowledge in Groups: How Formal Interventions Enable Flexibility," Organization Science, INFORMS, vol. 13(4), pages 370-386, August.
    11. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    12. Alberto Galasso & Timothy S. Simcoe, 2011. "CEO Overconfidence and Innovation," Management Science, INFORMS, vol. 57(8), pages 1469-1484, August.
    13. Amanda J. Sharkey & Balázs Kovács, 2018. "The Many Gifts of Status: How Attending to Audience Reactions Drives the Use of Status," Management Science, INFORMS, vol. 64(11), pages 5422-5443, November.
    14. Ethan Mollick & Ramana Nanda, 2016. "Wisdom or Madness? Comparing Crowds with Expert Evaluation in Funding the Arts," Management Science, INFORMS, vol. 62(6), pages 1533-1553, June.
    15. Jonathon N. Cummings, 2004. "Work Groups, Structural Diversity, and Knowledge Sharing in a Global Organization," Management Science, INFORMS, vol. 50(3), pages 352-364, March.
    16. David Hirshleifer & Angie Low & Siew Hong Teoh, 2012. "Are Overconfident CEOs Better Innovators?," Journal of Finance, American Finance Association, vol. 67(4), pages 1457-1498, August.
    17. Melissa C. Thomas-Hunt & Tonya Y. Ogden & Margaret A. Neale, 2003. "Who's Really Sharing? Effects of Social and Expert Status on Knowledge Exchange Within Groups," Management Science, INFORMS, vol. 49(4), pages 464-477, April.
    18. Erin L. Scott & Pian Shu & Roman M. Lubynsky, 2020. "Entrepreneurial Uncertainty and Expert Evaluation: An Empirical Analysis," Management Science, INFORMS, vol. 66(3), pages 1278-1299, March.
    19. Caroline S. Wagner & Jeffrey Alexander, 2013. "Evaluating transformative research programmes: A case study of the NSF Small Grants for Exploratory Research programme," Research Evaluation, Oxford University Press, vol. 22(3), pages 187-197, June.
    20. O'Reilly, Charles A., III & Tushman, Michael L., 2013. "Organizational Ambidexterity: Past, Present and Future," Research Papers 2130, Stanford University, Graduate School of Business.
    21. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    22. Corinne Bendersky & Nicholas A. Hays, 2012. "Status Conflict in Groups," Organization Science, INFORMS, vol. 23(2), pages 323-340, April.
    23. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

1. Giulio Giacomo Cantone, 2024. "How to measure interdisciplinary research? A systemic design for the model of measurement," Scientometrics, Springer; Akadémiai Kiadó, vol. 129(8), pages 4937-4982, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hong Luo & Jeffrey Macher & Michael Wahlen, 2021. "Judgment Aggregation in Creative Production: Evidence from the Movie Industry," Management Science, INFORMS, vol. 67(10), pages 6358-6377, October.
    2. Marco Ottaviani, 2020. "Grantmaking," Working Papers 672, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    3. Schweisfurth, Tim & Zaggl, Michael A. & Schöttl, Claus P. & Raasch, Christina, 2017. "Hierarchical similarity biases in idea evaluation: A study in enterprise crowdfunding," Kiel Working Papers 2095, Kiel Institute for the World Economy (IfW Kiel).
    4. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    5. Blandinieres, Florence & Pellens, Maikel, 2021. "Scientist's industry engagement and the research agenda: Evidence from Germany," ZEW Discussion Papers 21-001, ZEW - Leibniz Centre for European Economic Research.
    6. Quignon, Aurelien, 2023. "Crowd-based feedback and early-stage entrepreneurial performance: Evidence from a digital platform," Research Policy, Elsevier, vol. 52(7).
    7. Erin L. Scott & Pian Shu & Roman M. Lubynsky, 2020. "Entrepreneurial Uncertainty and Expert Evaluation: An Empirical Analysis," Management Science, INFORMS, vol. 66(3), pages 1278-1299, March.
    8. Prokudina, Elena & Renneboog, Luc & Tobler, Philippe, 2015. "Does Confidence Predict Out-of-Domain Effort?," Discussion Paper 2015-055, Tilburg University, Center for Economic Research.
    9. Joon Mahn Lee & Jung Chul Park & Guoli Chen, 2023. "A cognitive perspective on real options investment: CEO overconfidence," Strategic Management Journal, Wiley Blackwell, vol. 44(4), pages 1084-1110, April.
    10. Bharati, Rakesh & Doellman, Thomas & Fu, Xudong, 2016. "CEO confidence and stock returns," Journal of Contemporary Accounting and Economics, Elsevier, vol. 12(1), pages 89-110.
    11. Daniel P. Gross & Bhaven N. Sampat, 2022. "Crisis Innovation Policy from World War II to COVID-19," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 135-181.
    12. Laura J. Kornish & Sharaya M. Jones, 2021. "Raw Ideas in the Fuzzy Front End: Verbosity Increases Perceived Creativity," Marketing Science, INFORMS, vol. 40(6), pages 1106-1122, November.
    13. Wu, Qiang & Dbouk, Wassim & Hasan, Iftekhar & Kobeissi, Nada & Zheng, Li, 2021. "Does gender affect innovation? Evidence from female chief technology officers," Research Policy, Elsevier, vol. 50(9).
    14. Ren, Shenggang & Cheng, Yingmei & Hu, Yucai & Yin, Chao, 2021. "Feeling right at home: Hometown CEOs and firm innovation," Journal of Corporate Finance, Elsevier, vol. 66(C).
    15. Matteo Prato & Fabrizio Ferraro, 2018. "Starstruck: How Hiring High-Status Employees Affects Incumbents’ Performance," Organization Science, INFORMS, vol. 29(5), pages 755-774, October.
    16. Linda Argote & Sunkee Lee & Jisoo Park, 2021. "Organizational Learning Processes and Outcomes: Major Findings and Future Research Directions," Management Science, INFORMS, vol. 67(9), pages 5399-5429, September.
    17. Jacqueline N. Lane & Ina Ganguli & Patrick Gaule & Eva Guinan & Karim R. Lakhani, 2021. "Engineering serendipity: When does knowledge sharing lead to knowledge production?," Strategic Management Journal, Wiley Blackwell, vol. 42(6), pages 1215-1244, June.
    18. David H. Weng & Yasuhiro Yamakawa, 2023. "I believe I can fly: how target venture CEO overconfidence affects acquisition completion," Small Business Economics, Springer, vol. 61(1), pages 127-151, June.
    19. Chen, Shu & Ying, Sammy Xiaoyan & Wu, Huiying & You, Jiaxing, 2021. "Carrying on the family's legacy: Male heirs and firm innovation," Journal of Corporate Finance, Elsevier, vol. 69(C).
    20. Helen X. H. Bao & Steven Haotong Li, 2016. "Overconfidence And Real Estate Research: A Survey Of The Literature," The Singapore Economic Review (SER), World Scientific Publishing Co. Pte. Ltd., vol. 61(04), pages 1-24, September.
