
The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge

Author

Listed:
  • Fanelli, Daniele

Abstract

Scientists' ability to integrate diverse forms of evidence and evaluate how well they can explain and predict phenomena, in other words, to know how much they know, struggles to keep pace with technological innovation. Central to the challenge of extracting knowledge from data is the need to develop a metric of knowledge itself. A candidate metric of knowledge, K, was recently proposed by the author. This essay further advances and integrates that proposal by developing a methodology to measure its key variable, symbolized with the Greek letter τ ("tau"). It will be shown how a τ can represent the description of any phenomenon, any theory to explain it, and any methodology to study it, allowing the knowledge about that phenomenon to be measured with K. To illustrate potential applications, the essay calculates τ and K values of: logical syllogisms and proofs, mathematical calculations, empirical quantitative knowledge, statistical model selection problems, including how to correct for "forking paths" and "P-hacking" biases, randomised controlled experiments, reproducibility and replicability, qualitative analyses via process tracing, and mixed quantitative and qualitative evidence. Whilst preliminary in many respects, these results suggest that K theory offers a meaningful understanding of knowledge, which makes testable metascientific predictions and which may be used to analyse and integrate qualitative and quantitative evidence to tackle complex problems.
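
The abstract describes K as an information-based measure of knowledge whose inputs are τ encodings of phenomena, theories, and methods. As an illustrative aside only (this is not the K or τ defined in the paper, which are not reproduced here), the Python sketch below computes a generic, normalized information-gain score: the fraction of uncertainty about a phenomenon's observed outcomes that is removed by a hypothetical theory's predictions. All names and data in it are assumptions made up for illustration.

# Illustrative sketch only: a generic, normalized information-gain measure,
# NOT the K metric or the tau encoding defined in Fanelli's paper.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of discrete outcomes."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def conditional_entropy(outcomes, predictions):
    """Average entropy of outcomes within each predicted class: H(outcome | prediction)."""
    n = len(outcomes)
    groups = {}
    for o, p in zip(outcomes, predictions):
        groups.setdefault(p, []).append(o)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def information_gain_ratio(outcomes, predictions):
    """Fraction of outcome uncertainty removed by the predictions (0 = none, 1 = all)."""
    h = entropy(outcomes)
    return 0.0 if h == 0 else (h - conditional_entropy(outcomes, predictions)) / h

# Hypothetical toy data: observed experimental outcomes and a theory's predictions.
observed  = ["recover", "recover", "relapse", "recover", "relapse", "recover"]
predicted = ["recover", "recover", "relapse", "recover", "recover", "recover"]
print(round(information_gain_ratio(observed, predicted), 3))

In this toy a score of 1 would mean the predictions fully determine the observed outcomes, and 0 that they carry no information about them; the paper's own framework should be consulted for how τ and K are actually defined and computed.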

Suggested Citation

  • Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak_v1, Center for Open Science.
  • Handle: RePEc:osf:metaar:67sak_v1
    DOI: 10.31219/osf.io/67sak_v1

    Download full text from publisher

    File URL: https://osf.io/download/61d7ffe5da63201206fe6b5a/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/67sak_v1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Rubin, Mark, 2020. "Does preregistration improve the credibility of research findings?," MetaArXiv vgr89, Center for Open Science.
    2. Bart Penders & J. Britt Holbrook & Sarah de Rijcke, 2019. "Rinse and Repeat: Understanding the Value of Replication across Different Ways of Knowing," Publications, MDPI, vol. 7(3), pages 1-15, July.
    3. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    4. Fairfield, Tasha & Charman, Andrew E., 2017. "Explicit Bayesian Analysis for Process Tracing: Guidelines, Opportunities, and Caveats," Political Analysis, Cambridge University Press, vol. 25(3), pages 363-380, July.
    5. Fairfield, Tasha & Charman, Andrew, 2017. "Explicit Bayesian analysis for process tracing: guidelines, opportunities, and caveats," LSE Research Online Documents on Economics 69203, London School of Economics and Political Science, LSE Library.
    6. Ronald L. Wasserstein & Nicole A. Lazar, 2016. "The ASA's Statement on p -Values: Context, Process, and Purpose," The American Statistician, Taylor & Francis Journals, vol. 70(2), pages 129-133, May.
    7. Freese, Jeremy & Peterson, David, 2017. "Replication in Social Science," SocArXiv 5bck9, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    2. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    3. Alejandro Avenburg & John Gerring & Jason Seawright, 2023. "How do social scientists reach causal inferences? A study of reception," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 257-275, February.
    4. Martin Rabbia, 2023. "Why did Argentina and Uruguay decide to pursue a carbon tax? Fiscal reforms and explicit carbon prices," Review of Policy Research, Policy Studies Organization, vol. 40(2), pages 230-259, March.
    5. Fernández Milmanda, Belén & Garay, Candelaria, 2019. "Subnational variation in forest protection in the Argentine Chaco," World Development, Elsevier, vol. 118(C), pages 79-90.
    6. Brandão, Frederico & Befani, Barbara & Soares-Filho, Jaílson & Rajão, Raoni & Garcia, Edenise, 2023. "How to halt deforestation in the Amazon? A Bayesian process-tracing approach," Land Use Policy, Elsevier, vol. 133(C).
    7. Fairfield, Tasha & Charman, Andrew, 2019. "A Dialogue with the Data: the Bayesian foundations of iterative research in qualitative social science," LSE Research Online Documents on Economics 89261, London School of Economics and Political Science, LSE Library.
    8. Alvarado, Miriam & Penney, Tarra L. & Unwin, Nigel & Murphy, Madhuvanti M. & Adams, Jean, 2021. "Evidence of a health risk ‘signalling effect’ following the introduction of a sugar-sweetened beverage tax," Food Policy, Elsevier, vol. 102(C).
    9. David A. Bateman & Dawn Langan Teele, 2020. "A developmental approach to historical causal inference," Public Choice, Springer, vol. 185(3), pages 253-279, December.
    10. Konstantinos Bourazas & Guido Consonni & Laura Deldossi, 2024. "Bayesian sample size determination for detecting heterogeneity in multi-site replication studies," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 33(3), pages 697-716, September.
    11. Amengual, Matthew, 2018. "Buying stability: The distributive outcomes of private politics in the Bolivian mining industry," World Development, Elsevier, vol. 104(C), pages 31-45.
    12. Rosa W. Runhardt, 2024. "Concrete Counterfactual Tests for Process Tracing: Defending an Interventionist Potential Outcomes Framework," Sociological Methods & Research, , vol. 53(4), pages 1591-1628, November.
    13. Aaron Deslatte & Serena Kim & Christopher V. Hawkins & Eric Stokan, 2024. "Keeping policy commitments: An organizational capability approach to local green housing equity," Review of Policy Research, Policy Studies Organization, vol. 41(1), pages 135-159, January.
    14. Ugofilippo Basellini, 2024. "Open science practices in demographic research: An appraisal," Demographic Research, Max Planck Institute for Demographic Research, Rostock, Germany, vol. 50(43), pages 1265-1280.
    15. Blair, Graeme & Cooper, Jasper & Coppock, Alexander & Humphreys, Macartan, 2019. "Declaring and Diagnosing Research Designs," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 113(3), pages 838-859.
    16. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    17. Segurado, Pedro & Gutiérrez-Cánovas, Cayetano & Ferreira, Teresa & Branco, Paulo, 2022. "Stressor gradient coverage affects interaction identification," Ecological Modelling, Elsevier, vol. 472(C).
    18. Abramson, Corey & Li, Zhuofan, 2024. "Ethnography and Machine Learning: Synergies and New Directions," OSF Preprints jvpbw_v1, Center for Open Science.
    19. Gergely Ganics & Atsushi Inoue & Barbara Rossi, 2021. "Confidence Intervals for Bias and Size Distortion in IV and Local Projections-IV Models," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 307-324, January.
    20. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:67sak_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registering allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF. General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.