
Pitfalls and potentials in simulation studies: Questionable research practices in comparative simulation studies allow for spurious claims of superiority of any method


Pawel, Samuel; Kook, Lucas; Reeve, Kelly (2024). Pitfalls and potentials in simulation studies: Questionable research practices in comparative simulation studies allow for spurious claims of superiority of any method. Biometrical Journal, 66(1):e202200091.

Abstract

Comparative simulation studies are workhorse tools for benchmarking statistical methods. As with other empirical studies, the success of simulation studies hinges on the quality of their design, execution, and reporting. If not conducted carefully and transparently, their conclusions may be misleading. In this paper, we discuss various questionable research practices, which may impact the validity of simulation studies, some of which cannot be detected or prevented by the current publication process in statistics journals. To illustrate our point, we invent a novel prediction method with no expected performance gain and benchmark it in a preregistered comparative simulation study. We show how easy it is to make the method appear superior over well‐established competitor methods if questionable research practices are employed. Finally, we provide concrete suggestions for researchers, reviewers, and other academic stakeholders for improving the methodological quality of comparative simulation studies, such as preregistering simulation protocols, incentivizing neutral simulation studies, and code and data sharing.


Statistics

Citations

5 citations in Web of Science®
6 citations in Scopus®

Downloads

6 downloads since deposited on 16 Jan 2024
6 downloads in the last 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 04 Faculty of Medicine > Epidemiology, Biostatistics and Prevention Institute (EBPI)
Dewey Decimal Classification: 610 Medicine & health
Scopus Subject Areas: Physical Sciences > Statistics and Probability; Social Sciences & Humanities > Statistics, Probability and Uncertainty
Uncontrolled Keywords: Statistics, Probability and Uncertainty, General Medicine, Statistics and Probability
Language: English
Date: January 2024
Deposited On: 16 Jan 2024 09:41
Last Modified: 30 Jun 2024 01:37
Publisher: Wiley-Blackwell Publishing, Inc.
ISSN: 0323-3847
OA Status: Hybrid
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1002/bimj.202200091
PubMed ID: 36890629
Project Information:
  • Funder: SNSF
  • Grant ID: 189295
  • Project Title: Reverse-Bayes Design and Analysis of Replication Studies

Download:
  • Content: Published Version
  • Language: English
  • Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)