IDEAS home Printed from https://ideas.repec.org/p/osf/osfxxx/zue4h.html

The case for default point-H1-hypotheses: a theory-construction perspective

Author

Listed:
  • Zenker, Frank

    (Lund University)

  • Witte, Erich H.

Abstract

The development of an empirically adequate theoretical construct for a given phenomenon of interest requires an estimate of the population effect size, also known as the true effect. Arriving at this estimate in evidence-based ways presupposes access to robust experimental or observational findings, defined as statistically significant test results obtained with high statistical power. In the behavioral sciences, however, even the best journals typically publish statistically significant test results with insufficient statistical power, entailing that such findings have a low replication probability. Whereas a robust finding formally requires that an empirical study engage point-specific H0- and H1-hypotheses, behavioral scientists today typically point-specify only the H0 and instead engage a composite (directional) H1. This mismatch renders the prospects for theory construction poor, because the population effect size (the very parameter that is to be modelled) regularly remains unknown, and so hinders the development of empirically adequate theoretical constructs. Based on the research program strategy (RPS), a sophisticated integration of Frequentist and Bayesian elements of statistical inference, we claim here that theoretical progress requires engaging point-H1-hypotheses by default.
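The abstract's central contrast can be made concrete with a power calculation: once the H1 is point-specified (a definite standardized effect size d), statistical power is a computable quantity, and the sample size needed to meet a robustness criterion (e.g. power ≥ .95 at α = .05) follows directly; a merely composite H1 (d > 0) fixes no such number. The sketch below, for a two-sided one-sample z-test, is our illustration, not code from the paper; the function names and the example values d = 0.5 and target power .95 are our own assumptions.

```python
from statistics import NormalDist  # stdlib standard-normal distribution

_N = NormalDist()

def power_point_h1(d, n, alpha=0.05):
    """Power of a two-sided one-sample z-test of H0: delta = 0
    against the point alternative H1: delta = d (standardized
    effect size), with n observations."""
    z_crit = _N.inv_cdf(1 - alpha / 2)
    nc = d * n ** 0.5  # noncentrality parameter under the point H1
    # probability of rejecting H0 in either tail when H1 is true
    return (1 - _N.cdf(z_crit - nc)) + _N.cdf(-z_crit - nc)

def n_for_power(d, target=0.95, alpha=0.05):
    """Smallest n at which the test attains the target power
    for the point H1: delta = d."""
    n = 2
    while power_point_h1(d, n, alpha) < target:
        n += 1
    return n
```

For instance, `n_for_power(0.5)` yields the smallest sample size at which a test of H0: δ = 0 against the point H1: δ = 0.5 reaches 95% power; without a point-specified H1, this pre-study planning step is undefined, which is the mismatch the abstract describes.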

Suggested Citation

  • Zenker, Frank & Witte, Erich H., 2021. "The case for default point-H1-hypotheses: a theory-construction perspective," OSF Preprints zue4h, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:zue4h
    DOI: 10.31219/osf.io/zue4h

    Download full text from publisher

    File URL: https://osf.io/download/610fac76e3801303b0965400/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/zue4h?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Christopher J. Bryan & Elizabeth Tipton & David S. Yeager, 2021. "Behavioural science is unlikely to change the world without a heterogeneity revolution," Nature Human Behaviour, Nature, vol. 5(8), pages 980-989, August.
    3. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    4. Joseph J. Locascio, 2019. "The Impact of Results Blind Science Publishing on Statistical Consultation and Collaboration," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 346-351, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    2. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    3. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    4. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    5. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    6. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    7. Piasenti, Stefano & Valente, Marica & Van Veldhuizen, Roel & Pfeifer, Gregor, 2023. "Does Unfairness Hurt Women? The Effects of Losing Unfair Competitions," Working Papers 2023:7, Lund University, Department of Economics.
    8. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    9. Sujin Park & Ali Tafti & Galit Shmueli, 2024. "Transporting Causal Effects Across Populations Using Structural Causal Modeling: An Illustration to Work-from-Home Productivity," Information Systems Research, INFORMS, vol. 35(2), pages 686-705, June.
    10. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    11. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    12. Anton, Roman, 2014. "Sustainable Intrapreneurship - The GSI Concept and Strategy - Unfolding Competitive Advantage via Fair Entrepreneurship," MPRA Paper 69713, University Library of Munich, Germany, revised 01 Feb 2015.
    13. Dudek, Thomas & Brenøe, Anne Ardila & Feld, Jan & Rohrer, Julia, 2022. "No Evidence That Siblings' Gender Affects Personality across Nine Countries," IZA Discussion Papers 15137, Institute of Labor Economics (IZA).
    14. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    15. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    16. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    17. Bauer, Jan M. & Nielsen, Kristian S. & Hofmann, Wilhelm & Reisch, Lucia A., 2022. "Healthy eating in the wild: An experience-sampling study of how food environments and situational factors shape out-of-home dietary success," Social Science & Medicine, Elsevier, vol. 299(C).
    18. Aurelie Seguin & Wolfgang Forstmeier, 2012. "No Band Color Effects on Male Courtship Rate or Body Mass in the Zebra Finch: Four Experiments and a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 7(6), pages 1-11, June.
    19. Ankur Moitra & Dhruv Rohatgi, 2022. "Provably Auditing Ordinary Least Squares in Low Dimensions," Papers 2205.14284, arXiv.org, revised Jun 2022.
    20. Dragana Radicic & Geoffrey Pugh & Hugo Hollanders & René Wintjes & Jon Fairburn, 2016. "The impact of innovation support programs on small and medium enterprises innovation in traditional manufacturing industries: An evaluation for seven European Union regions," Environment and Planning C, , vol. 34(8), pages 1425-1452, December.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.