
Conclusions often diverge when hundreds of researchers reanalyze the same data

04.02.26 | Goethe University Frankfurt


FRANKFURT. A new study published in Nature, “Investigating the Analytical Robustness of the Social and Behavioural Sciences,” finds that scientific conclusions can shift dramatically depending on who conducts the analysis. The results come from a large-scale international collaboration led by Balázs Aczél and Barnabás Szászi (Eötvös Loránd University and Corvinus University), conducted as part of the Systematizing Confidence in Open Research and Evidence (SCORE) program. A team of 457 independent analysts from institutions around the world conducted 504 re-analyses of data from 100 previously published studies across the social and behavioral sciences. For each study, all assigned analysts received the same dataset and the same key research question but were free to decide how to conduct the analysis based on their own informed judgment.

Over the past decade, the social and behavioral sciences have undergone substantial reforms aimed at making research more transparent, rigorous, and reliable. Preregistration, registered reports, replication studies, and checks of analytical reproducibility all seek to reduce the prevalence of chance findings and biased results. One important question, however, has received relatively little attention: to what extent do research findings depend on the specific way in which data are analyzed?

In standard scientific practice, a dataset is typically analyzed by a single researcher or research team, and the resulting publication presents the outcome of one particular analytical pathway. While peer review assesses methodological acceptability, it rarely reveals what results might have emerged under alternative, yet equally defensible, statistical decisions.

Yet empirical research involves numerous decision points: how data are cleaned, how variables are defined, which statistical models or software are used, and how results are interpreted. Together, these choices constitute what is known as analytic variability – the flexibility that can fundamentally influence final conclusions.
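The sketch below is purely illustrative and is not taken from the study: it uses simulated data and hypothetical choices to show how two equally defensible pipelines (outlier handling, covariate adjustment) can produce different estimates of the same effect.

```python
# Illustrative sketch of analytic variability (not the study's data or methods).
# Two defensible pipelines on the same simulated dataset yield different estimates.
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Simulated dataset: a binary "treatment", a covariate "age", and an outcome.
treatment = rng.integers(0, 2, size=n)
age = rng.normal(40, 12, size=n)
outcome = 0.3 * treatment + 0.02 * age + rng.normal(0, 1, size=n)
outcome[rng.choice(n, size=5, replace=False)] += 6  # a few extreme values

def ols(X, y):
    """Ordinary least squares; returns the coefficient vector."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Pipeline A: keep all observations, adjust for age.
X_a = np.column_stack([np.ones(n), treatment, age])
effect_a = ols(X_a, outcome)[1]

# Pipeline B: drop outcome values beyond 3 SD, no age adjustment.
keep = np.abs(outcome - outcome.mean()) < 3 * outcome.std()
X_b = np.column_stack([np.ones(keep.sum()), treatment[keep]])
effect_b = ols(X_b, outcome[keep])[1]

print(f"Pipeline A treatment effect: {effect_a:.3f}")
print(f"Pipeline B treatment effect: {effect_b:.3f}")
```

Both pipelines are defensible choices a careful analyst might make, yet they return noticeably different effect estimates from identical raw data, which is the phenomenon the study quantifies at scale.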

Key Findings
The study now published in Nature shows substantial variation in the outcomes of independent analyses of the same question using the same data across 100 studies. Although most re-analyses broadly supported the main claims of the original studies, effect sizes, statistical estimates, and levels of uncertainty often differed meaningfully. Only in about one third of cases did all analysts reach the same conclusion as the original authors.

Importantly, these discrepancies were not due to a lack of expertise. Experienced researchers with strong statistical backgrounds were just as likely to arrive at divergent results as others. At the same time, observational studies proved less robust than experimental ones, suggesting that more complex data structures allow greater analytical flexibility – and thus greater uncertainty.

Prof. Dr. Jan Landwehr of Goethe University Frankfurt, who was involved in the study as an analyst, explains the findings: “Just as major decisions should not rest on a single study, they should not depend on a single data analysis either. Only when different, well-founded analytical approaches converge on a consistent pattern can a result be considered truly robust. In that sense, our study is also a call for stronger collaboration across research teams and for more intensive scientific exchange.”

Article Information

Journal: Nature
Article title: Investigating the analytical robustness of the social and behavioural sciences
Publication date: 1-Apr-2026

Contact Information

Dr. Dirk Frank
Goethe University Frankfurt
frank@pvw.uni-frankfurt.de

How to Cite This Article

APA:
Goethe University Frankfurt. (2026, April 2). Conclusions often diverge when hundreds of researchers reanalyze the same data. Brightsurf News. https://www.brightsurf.com/news/19NQ5X01/conclusions-often-diverge-when-hundreds-of-researchers-reanalyze-the-same-data.html
MLA:
"Conclusions often diverge when hundreds of researchers reanalyze the same data." Brightsurf News, 2 Apr. 2026, https://www.brightsurf.com/news/19NQ5X01/conclusions-often-diverge-when-hundreds-of-researchers-reanalyze-the-same-data.html.