
A new standard for the analysis and design of replication studies


Held, Leonhard (2020). A new standard for the analysis and design of replication studies. Journal of the Royal Statistical Society: Series A, 183(2):431-448.

Abstract

A new standard is proposed for the evidential assessment of replication studies. The approach combines a specific reverse Bayes technique with prior‐predictive tail probabilities to define replication success. The method gives rise to a quantitative measure for replication success, called the sceptical p‐value. The sceptical p‐value integrates traditional significance of both the original and the replication study with a comparison of the respective effect sizes. It incorporates the uncertainty of both the original and the replication effect estimates and reduces to the ordinary p‐value of the replication study if the uncertainty of the original effect estimate is ignored. The framework proposed can also be used to determine the power or the required replication sample size to achieve replication success. Numerical calculations highlight the difficulty of achieving replication success if the evidence from the original study is only suggestive. An application to data from the Open Science Collaboration project on the replicability of psychological science illustrates the methodology proposed.
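As an illustrative sketch only (not code from the paper): in the special case of equal standard errors in the original and replication study, the sceptical p-value can be computed from the two z-statistics, with the squared sceptical z-value equal to half the harmonic mean of the squared original and replication z-values. The function name `p_sceptical` and the equal-standard-error simplification are assumptions of this sketch; the paper's general framework also covers unequal standard errors.

```python
import math

def p_sceptical(z_o, z_r):
    """Two-sided sceptical p-value, assuming equal standard errors
    in the original and replication study (variance ratio c = 1).

    z_o, z_r: z-statistics of the original and replication effect estimates.
    In this special case z_S^2 = z_o^2 * z_r^2 / (z_o^2 + z_r^2),
    i.e. half the harmonic mean of z_o^2 and z_r^2.
    """
    z_s = math.sqrt((z_o**2 * z_r**2) / (z_o**2 + z_r**2))
    # Two-sided tail probability: 2 * (1 - Phi(z_s)) = erfc(z_s / sqrt(2))
    return math.erfc(z_s / math.sqrt(2))

# A convincing original study (z_o = 3) followed by an equally convincing
# replication (z_r = 3) still gives a sceptical p-value well above the
# ordinary two-sided p-value of either study, reflecting the stricter standard.
print(p_sceptical(3.0, 3.0))
```

Note how, as the original z-statistic grows (its uncertainty becomes negligible), z_S approaches z_r, so the sceptical p-value approaches the ordinary p-value of the replication study, consistent with the limiting behaviour described in the abstract.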


Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 04 Faculty of Medicine > Epidemiology, Biostatistics and Prevention Institute (EBPI); Special Collections > Centers of Competence > Center for Reproducible Science
Dewey Decimal Classification: 610 Medicine & health
Scopus Subject Areas: Physical Sciences > Statistics and Probability; Social Sciences & Humanities > Social Sciences (miscellaneous); Social Sciences & Humanities > Economics and Econometrics; Social Sciences & Humanities > Statistics, Probability and Uncertainty
Uncontrolled Keywords: Statistics, Probability and Uncertainty; Economics and Econometrics; Statistics and Probability; Social Sciences (miscellaneous)
Language: English
Date: 1 February 2020
Deposited On: 13 Jan 2021 16:28
Last Modified: 27 Jan 2022 04:08
Publisher: Wiley-Blackwell Publishing, Inc.
ISSN: 0964-1998
OA Status: Hybrid
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1111/rssa.12493
Related URLs: https://www.zora.uzh.ch/id/eprint/175101/

Download

Hybrid Open Access

Content: Published Version
Filetype: PDF
Size: 699kB
Licence: Creative Commons: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)