Printed from https://ideas.repec.org/p/osf/osfxxx/zue4h.html

The case for default point-H1-hypotheses: a theory-construction perspective

Author

Listed:
  • Zenker, Frank (Lund University)
  • Witte, Erich H.

Abstract

The development of an empirically adequate theoretical construct for a given phenomenon of interest requires an estimate of the population effect size, also known as the true effect. Arriving at this estimate in evidence-based ways presupposes access to robust experimental or observational findings, defined as statistically significant test results obtained with high statistical power. In the behavioral sciences, however, even the best journals typically publish statistically significant test results with insufficient statistical power, entailing that such findings have a low replication probability. Whereas a robust finding formally requires that an empirical study engage with point-specific H0- and H1-hypotheses, behavioral scientists today typically point-specify only the H0 and instead engage a composite (directional) H1. This mismatch leaves the prospects for theory construction poor, because the population effect size, the very parameter that is to be modelled, regularly remains unknown, which can only hinder the development of empirically adequate theoretical constructs. Based on the research program strategy (RPS), a sophisticated integration of frequentist and Bayesian elements of statistical inference, we claim here that theoretical progress requires engaging with point-H1-hypotheses by default.
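The abstract's core contrast can be made concrete with a small power calculation. The sketch below (our illustration, not the authors' RPS machinery; all function names are ours) uses the standard normal approximation for a two-sided, two-sample test: once the H1 is point-specified as a concrete effect size (Cohen's d), both the power of a given design and the sample size needed for a target power become computable, whereas a merely directional H1 (d > 0) fixes no such numbers.

```python
from math import erf, sqrt

def norm_cdf(x):
    # Standard normal CDF, expressed via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    # Standard normal quantile by bisection (accurate enough here).
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def power_two_sample(d, n, alpha=0.05):
    # Approximate power of a two-sided two-sample z-test with n subjects
    # per group, given a *point* H1 effect size d (Cohen's d).
    z_crit = norm_ppf(1.0 - alpha / 2.0)
    lam = d * sqrt(n / 2.0)  # noncentrality under the point H1
    return norm_cdf(lam - z_crit) + norm_cdf(-lam - z_crit)

def required_n(d, target_power=0.8, alpha=0.05):
    # Smallest per-group n reaching the target power -- computable only
    # because the H1 is point-specified.
    n = 2
    while power_two_sample(d, n, alpha) < target_power:
        n += 1
    return n

# With a point H1 of d = 0.5, roughly 63-64 subjects per group yield
# 80% power under this normal approximation.
print(required_n(0.5))
```

With only a directional H1, `power_two_sample` has no argument to evaluate, which is the mismatch the paper identifies: significance can be declared, but power, and hence replication probability, remains undefined.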

Suggested Citation

  • Zenker, Frank & Witte, Erich H., 2021. "The case for default point-H1-hypotheses: a theory-construction perspective," OSF Preprints zue4h, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:zue4h
    DOI: 10.31219/osf.io/zue4h

    Download full text from publisher

    File URL: https://osf.io/download/610fac76e3801303b0965400/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/zue4h?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Joseph J. Locascio, 2019. "The Impact of Results Blind Science Publishing on Statistical Consultation and Collaboration," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 346-351, March.
    3. Christopher J. Bryan & Elizabeth Tipton & David S. Yeager, 2021. "Behavioural science is unlikely to change the world without a heterogeneity revolution," Nature Human Behaviour, Nature, vol. 5(8), pages 980-989, August.
    4. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    3. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    4. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    5. Piasenti, Stefano & Valente, Marica & van Veldhuizen, Roel & Pfeifer, Gregor, 2023. "Does Unfairness Hurt Women? The Effects of Losing Unfair Competitions," IZA Discussion Papers 16324, Institute of Labor Economics (IZA).
    6. Sujin Park & Ali Tafti & Galit Shmueli, 2024. "Transporting Causal Effects Across Populations Using Structural Causal Modeling: An Illustration to Work-from-Home Productivity," Information Systems Research, INFORMS, vol. 35(2), pages 686-705, June.
    7. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    8. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    9. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    10. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    11. Aurelie Seguin & Wolfgang Forstmeier, 2012. "No Band Color Effects on Male Courtship Rate or Body Mass in the Zebra Finch: Four Experiments and a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 7(6), pages 1-11, June.
    12. Dragana Radicic & Geoffrey Pugh & Hugo Hollanders & René Wintjes & Jon Fairburn, 2016. "The impact of innovation support programs on small and medium enterprises innovation in traditional manufacturing industries: An evaluation for seven European Union regions," Environment and Planning C, , vol. 34(8), pages 1425-1452, December.
    13. Bauer, Jan M. & Aarestrup, Simon C. & Hansen, Pelle G. & Reisch, Lucia A., 2022. "Nudging more sustainable grocery purchases: Behavioural innovations in a supermarket setting," Technological Forecasting and Social Change, Elsevier, vol. 179(C).
    14. Li, Lunzheng & Maniadis, Zacharias & Sedikides, Constantine, 2021. "Anchoring in Economics: A Meta-Analysis of Studies on Willingness-To-Pay and Willingness-To-Accept," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 90(C).
    15. Lars Behlen & Oliver Himmler & Robert Jäckle, 2023. "Defaults and effortful tasks," Experimental Economics, Springer;Economic Science Association, vol. 26(5), pages 1022-1059, November.
    16. Diekmann Andreas, 2011. "Are Most Published Research Findings False?," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 231(5-6), pages 628-635, October.
    17. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    18. Kirthi Kalyanam & John McAteer & Jonathan Marek & James Hodges & Lifeng Lin, 2018. "Cross channel effects of search engine advertising on brick & mortar retail sales: Meta analysis of large scale field experiments on Google.com," Quantitative Marketing and Economics (QME), Springer, vol. 16(1), pages 1-42, March.
    19. Nazila Alinaghi & W. Robert Reed, 2021. "Taxes and Economic Growth in OECD Countries: A Meta-analysis," Public Finance Review, , vol. 49(1), pages 3-40, January.
    20. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:zue4h. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.