Why Scientific Results Vary


Text: Bielefeld University

Different analytical methods can have a significant impact on the results of scientific studies. This is demonstrated by a study conducted by an international research team that included researchers from Bielefeld University. In the study, more than 300 scientists compared 174 independent analyses of the same dataset and found that different methods can lead to highly variable conclusions.

The study, published in BMC Biology, shows that different scientists working with the same datasets can arrive at vastly different results. This insight highlights how analytical choices can significantly influence scientific conclusions. “Our work demonstrates that scientific analyses do not solely depend on the underlying data but also on the decisions researchers make during analysis,” explains Alfredo Sánchez-Tójar from Bielefeld University’s Faculty of Biology. “This underscores the need for transparent research practices and increased replication studies.”

Alfredo Sánchez-Tójar, from the Faculty of Biology at Bielefeld University and an associate member of the Collaborative Research Centre TRR 212 (‘NC³’), was involved in the study.

Analysis Reveals Drastic Differences in Results

A comparison of the analyses submitted by 174 independent research groups found that different statistical methods and analytical approaches can lead to significantly differing outcomes. These findings raise fundamental questions about the reproducibility and reliability of scientific results.
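The effect described above can be illustrated with a small, entirely hypothetical sketch (this is not the study's data or methodology): two analysts apply equally defensible choices to the same toy dataset and report noticeably different effect sizes, simply because one screens for extreme observations and the other does not.

```python
# Hypothetical illustration: the same dataset, analysed with two
# defensible analytical choices, yields different effect sizes.
# The data and the 3-SD outlier rule are invented for this sketch.
import statistics


def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs (the 'effect size' here)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den


# Toy dataset: y = 0.5 * x, with one extreme observation added at the end.
x = [i / 10 for i in range(50)]
y = [0.5 * xi for xi in x]
y[-1] += 15  # a single outlier

# Analyst A: uses all data as-is.
effect_a = slope(x, y)

# Analyst B: first excludes observations more than 3 SD from the mean.
m, s = statistics.fmean(y), statistics.stdev(y)
kept = [(a, b) for a, b in zip(x, y) if abs(b - m) <= 3 * s]
effect_b = slope([a for a, _ in kept], [b for _, b in kept])

print(f"Analyst A (all data):        {effect_a:.2f}")  # prints 0.85
print(f"Analyst B (outlier removed): {effect_b:.2f}")  # prints 0.50
```

Both pipelines are reasonable, yet they report effect sizes of roughly 0.85 and 0.50 for the same data, which is the kind of analyst-driven variation the study quantified at scale.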

The results have far-reaching implications for ecology, evolutionary biology, and beyond. Researchers at Bielefeld University emphasize the importance of Big-Team Science and open science practices to minimize biases in scientific findings. The study also confirms previous research from the university on publication bias in biology and highlights the necessity of structural changes in scientific incentives.

At the Collaborative Research Center TRR 212 (“NC³”), co-led by Bielefeld University, researchers are actively developing strategies to improve the reproducibility and reliability of scientific results. In particular, Subproject D05 focuses on transparent meta-analysis and training programs for early-career scientists. Additionally, several researchers from Bielefeld University are members of the Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology (SORTEE), which advocates for sustainable reforms in science.

“No single analysis should be considered a complete or reliable answer to a research question,” says Alfredo Sánchez-Tójar. “This is why it is essential to document and disclose the methods used to process data to ensure transparency in scientific findings.”

The study has already gained widespread attention within the scientific community and is regarded as a milestone for fostering a reflective and transparent research culture.

Original publication:

Elliot Gould, Hannah S. Fraser, Timothy H. Parker, Shinichi Nakagawa et al.: Same data, different analysts: variation in effect sizes due to analytical decisions in ecology and evolutionary biology. BMC Biology. https://doi.org/10.1186/s12915-024-02101-x, published on February 6, 2025.
