
Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results


Silberzahn, R; Uhlmann, E L; Martin, D P; Ullrich, Johannes (2018). Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results. Advances in Methods and Practices in Psychological Science, 1(3):337-356.

Abstract

Twenty-nine teams involving 61 analysts used the same data set to address the same research question: whether soccer referees are more likely to give red cards to dark-skin-toned players than to light-skin-toned players. Analytic approaches varied widely across the teams, and the estimated effect sizes ranged from 0.89 to 2.93 (Mdn = 1.31) in odds-ratio units. Twenty teams (69%) found a statistically significant positive effect, and 9 teams (31%) did not observe a significant relationship. Overall, the 29 different analyses used 21 unique combinations of covariates. Neither analysts’ prior beliefs about the effect of interest nor their level of expertise readily explained the variation in the outcomes of the analyses. Peer ratings of the quality of the analyses also did not account for the variability. These findings suggest that significant variation in the results of analyses of complex data may be difficult to avoid, even by experts with honest intentions. Crowdsourcing data analysis, a strategy in which numerous research teams are recruited to simultaneously investigate the same research question, makes transparent how defensible, yet subjective, analytic choices influence research results.


Downloads

48 downloads since deposited on 11 Dec 2018
25 downloads in the past 12 months

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 06 Faculty of Arts > Institute of Psychology
Dewey Decimal Classification: 150 Psychology
Language: English
Date: 1 September 2018
Deposited On: 11 Dec 2018 13:53
Last Modified: 29 Jul 2020 08:25
Publisher: Sage Publications Ltd.
ISSN: 2515-2467
OA Status: Hybrid
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1177/2515245917747646
Related URLs:
https://www.zora.uzh.ch/id/eprint/148470/
https://psyarxiv.com/qkwst/
https://doi.org/10.1177/2515245918810511

Download

Hybrid Open Access

Content: Published Version
Filetype: PDF
Size: 239kB
Licence: Creative Commons: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)