
Can Retracted Social Science Articles Be Distinguished from Non-Retracted Articles by Some of the Same Authors, Using Benford’s Law or Other Statistical Methods?

Author

Listed:
  • Walter R. Schumm

    (Department of Applied Human Sciences, Kansas State University, 1700 Anderson Avenue, Manhattan, KS 66506, USA)

  • Duane W. Crawford

    (Department of Applied Human Sciences, Kansas State University, 1700 Anderson Avenue, Manhattan, KS 66506, USA)

  • Lorenza Lockett

    (Department of Sociology, Anthropology, and Social Work, Kansas State University, 1603 Old Claflin Place, Manhattan, KS 66506, USA)

  • Asma bin Ateeq

    (Education Department, Arab East Colleges, 3310 Abdullah bin Umar, Al Qirawan, Riyadh 13544-6394, Saudi Arabia)

  • Abdullah AlRashed

    (Security Studies Program, Graduate School, Kansas State University, Fairchild Hall, 1700 Anderson Avenue, Manhattan, KS 66506, USA)

Abstract

Various authors have discussed ways to detect problems in small-sample social science surveys. Here, several new approaches for detecting anomalies in large samples are presented, and their use is illustrated through comparisons of seven retracted or corrected journal articles with a control group of eight articles published since 2000 by a similar group of authors on similar topics; all of the articles involved samples ranging from several hundred to many thousands of participants. Given the small sample of articles (k = 15) and low statistical power, only 2/12 of the individual anomaly comparisons were statistically significant, but large effect sizes ( d > 0.80) were common across most of the anomaly comparisons. A six-item total anomaly scale had a Cronbach alpha of 0.92, suggesting that the six anomalies were moderately correlated rather than isolated issues. The total anomaly scale differentiated the two groups of articles, with an effect size of 3.55 ( p < 0.001); an anomaly severity scale derived from the same six items, with an alpha of 0.94, yielded an effect size of 3.52 ( p < 0.001). Deviations from the predicted distribution of first digits in regression coefficients (Benford’s Law) were associated with anomalies and with differences between the two groups of articles; the results were mixed in terms of statistical significance, though the effect sizes were large ( d ≥ 0.90). The methodology was able to detect unusual anomalies in both retracted and non-retracted articles. In conclusion, the results provide several approaches that may be helpful for detecting questionable research practices, especially the fabrication of data or results, in social science, medical, or other scientific research.
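The Benford's-Law check described in the abstract compares the leading digits of reported regression coefficients against the expected proportions P(d) = log10(1 + 1/d) for d = 1..9. A minimal sketch of such a test follows; this is not the authors' actual code, and the function names are illustrative:

```python
import math
from collections import Counter

def benford_expected():
    # Benford's Law: P(d) = log10(1 + 1/d) for leading digit d = 1..9
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    # First significant digit of a nonzero value, ignoring sign and scale;
    # scientific notation guarantees the string starts with a nonzero digit
    return int(f"{abs(x):.10e}"[0])

def benford_chi_square(coefficients):
    # Pearson chi-square of observed leading digits vs. Benford expectations
    coeffs = [c for c in coefficients if c != 0]
    n = len(coeffs)
    observed = Counter(leading_digit(c) for c in coeffs)
    return sum((observed.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in benford_expected().items())
```

With 9 digit categories (8 degrees of freedom), a chi-square statistic above roughly 15.5 would indicate deviation from Benford's Law at the 0.05 level; as the abstract notes, with small samples of articles such tests may show large effect sizes without reaching conventional significance.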

Suggested Citation

  • Walter R. Schumm & Duane W. Crawford & Lorenza Lockett & Asma bin Ateeq & Abdullah AlRashed, 2023. "Can Retracted Social Science Articles Be Distinguished from Non-Retracted Articles by Some of the Same Authors, Using Benford’s Law or Other Statistical Methods?," Publications, MDPI, vol. 11(1), pages 1-13, March.
  • Handle: RePEc:gam:jpubli:v:11:y:2023:i:1:p:14-:d:1087282

    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/11/1/14/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/11/1/14/
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Bauer Johannes & Groß Jochen, 2011. "Difficulties Detecting Fraud? The Use of Benford’s Law on Regression Tables," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 231(5-6), pages 733-748, October.
    2. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    3. R Grant Steen & Arturo Casadevall & Ferric C Fang, 2013. "Why Has the Number of Scientific Retractions Increased?," PLOS ONE, Public Library of Science, vol. 8(7), pages 1-9, July.
    4. Nicole Shu Ling Yeo-Teh & Bor Luen Tang, 2022. "Sustained Rise in Retractions in the Life Sciences Literature during the Pandemic Years 2020 and 2021," Publications, MDPI, vol. 10(3), pages 1-12, August.
    5. Andreas Diekmann, 2007. "Not the First Digit! Using Benford's Law to Detect Fraudulent Scientific Data," Journal of Applied Statistics, Taylor & Francis Journals, vol. 34(3), pages 321-329.
    6. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    7. Justin T. Pickett, 2020. "The Stewart Retractions: A Quantitative and Qualitative Analysis," Econ Journal Watch, Econ Journal Watch, vol. 17(1), pages 152-190, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    2. Kiran Sharma, 2021. "Team size and retracted citations reveal the patterns of retractions from 1981 to 2020," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8363-8374, October.
    3. Gonzalo Marco-Cuenca & José Antonio Salvador-Oliván & Rosario Arquero-Avilés, 2021. "Fraud in scientific publications in the European Union. An analysis through their retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5143-5164, June.
    4. Bruno S. Frey, 2010. "Withering academia?," IEW - Working Papers 512, Institute for Empirical Research in Economics - University of Zurich.
    5. Caroline Lievore & Priscila Rubbo & Celso Biynkievycz Santos & Claudia Tânia Picinin & Luiz Alberto Pilatti, 2021. "Research ethics: a profile of retractions from world class universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6871-6889, August.
    6. Tariq Ahmad Shah & Sumeer Gul & Saimah Bashir & Suhail Ahmad & Assumpció Huertas & Andrea Oliveira & Farzana Gulzar & Ashaq Hussain Najar & Kanu Chakraborty, 2021. "Influence of accessibility (open and toll-based) of scholarly publications on retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4589-4606, June.
    7. Teddy Lazebnik & Dan Gorlitsky, 2023. "Can We Mathematically Spot the Possible Manipulation of Results in Research Manuscripts Using Benford’s Law?," Data, MDPI, vol. 8(11), pages 1-11, October.
    8. Catalin Toma & Liliana Padureanu & Bogdan Toma, 2022. "Correction of the Scientific Production: Publisher Performance Evaluation Using a Dataset of 4844 PubMed Retractions," Publications, MDPI, vol. 10(2), pages 1-25, April.
    9. Ali Ghorbi & Mohsen Fazeli-Varzaneh & Erfan Ghaderi-Azad & Marcel Ausloos & Marcin Kozak, 2021. "Retracted papers by Iranian authors: causes, journals, time lags, affiliations, collaborations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7351-7371, September.
    10. H. Latan & C.J. Chiappetta Jabbour & Ana Beatriz Lopes de Sousa Jabbour & M. Ali, 2023. "Crossing the Red Line? Empirical Evidence and Useful Recommendations on Questionable Research Practices among Business Scholars," Post-Print hal-04276024, HAL.
    11. Love, Peter E.D. & Ika, Lavagnon A. & Ahiaga-Dagbui, Dominic D., 2019. "On de-bunking ‘fake news’ in a post truth era: Why does the Planning Fallacy explanation for cost overruns fall short?," Transportation Research Part A: Policy and Practice, Elsevier, vol. 126(C), pages 397-408.
    12. Jeremy Hall & Ben R. Martin, 2019. "Towards a Taxonomy of Academic Misconduct: The Case of Business School Research," SPRU Working Paper Series 2019-02, SPRU - Science Policy Research Unit, University of Sussex Business School.
    13. Robert J Warren II & Joshua R King & Charlene Tarsa & Brian Haas & Jeremy Henderson, 2017. "A systematic review of context bias in invasion biology," PLOS ONE, Public Library of Science, vol. 12(8), pages 1-12, August.
    14. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    15. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    16. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    17. Judit Bar-Ilan & Gali Halevi, 2017. "Post retraction citations in context: a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 547-565, October.
    18. Sitsofe Tsagbey & Miguel de Carvalho & Garritt L. Page, 2017. "All Data are Wrong, but Some are Useful? Advocating the Need for Data Auditing," The American Statistician, Taylor & Francis Journals, vol. 71(3), pages 231-235, July.
    19. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    20. Gary Charness & David Masclet & Marie Claire Villeval, 2014. "The Dark Side of Competition for Status," Management Science, INFORMS, vol. 60(1), pages 38-55, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:11:y:2023:i:1:p:14-:d:1087282. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.