New Study Finds Different Researchers Draw Different Conclusions From the Same Data

Open Access paper

A groundbreaking study puts a spotlight on the reliability of ecological and evolutionary biology research.

A team of more than 300 researchers has published a study with the potential to profoundly influence the broader scientific community by highlighting the need to address analytical variability in research outcomes. The project was co-led by corresponding author Tim Parker, Professor of Biology and Environmental Studies at Whitman College (Walla Walla, Washington), together with primary co-authors Elliot Gould, a PhD student at the School of Biosciences, The 老虎机游戏_pt老虎机-平台*官网 of Melbourne; Hannah Fraser, a Postdoctoral Researcher at The 老虎机游戏_pt老虎机-平台*官网 of Melbourne; and Shinichi Nakagawa, Professor and Canada Excellence Research Chair at the 老虎机游戏_pt老虎机-平台*官网 of Alberta. Our very own Guy Sutton is one of the co-authors on this paper.

In this study, 174 analyst teams analysing the same sets of data reached strikingly variable answers to prespecified research questions, demonstrating diversity in analytical decision-making while shedding light on potential sources of unreliability and bias in scientific processes. The results align with growing recognition that the many choices researchers must make, such as which statistical methods to apply, can lead to divergent conclusions even when the different options are all reasonable.

The potential for such substantial variability has major implications for how ecologists and other scientists analyze data. This paper describes several data analysis practices that researchers could adopt in response to this variability. For instance, researchers might present several different analyses of the same data to assess the similarity of outcomes across statistical models, or they might attempt more ambitious ‘multiverse’ analyses in which they generate many hundreds or thousands of analyses to explore how different choices influence outcomes. These options join an ecosystem of other proposals to promote the reliability of scientific research, many of which focus on improving transparency.
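To make the multiverse idea concrete, here is a minimal Python sketch. It is not drawn from the paper itself: it invents a small simulated dataset and refits the same simple regression under every combination of a few illustrative analytical choices (log-transforming the response, including a covariate, trimming outliers), then reports how far the estimated effect moves across those choices.

```python
# A minimal 'multiverse' sketch: refit one simple regression under every
# combination of a few analytical choices and compare the resulting effect
# sizes. The dataset and the specific choices are invented for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
covariate = rng.normal(size=n)
predictor = rng.normal(size=n)
response = 0.3 * predictor + 0.5 * covariate + rng.normal(size=n)

def fit_effect(y, x, z, log_response, include_covariate, trim_outliers):
    """OLS fit of y on x under one set of analytical choices.
    Returns the estimated slope for the predictor of interest."""
    if log_response:
        y = np.log(y - y.min() + 1.0)          # shift so the log is defined
    if trim_outliers:
        keep = np.abs(y - y.mean()) < 2.5 * y.std()
        y, x, z = y[keep], x[keep], z[keep]
    cols = [np.ones_like(x), x] + ([z] if include_covariate else [])
    design = np.column_stack(cols)
    coefs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coefs[1]                             # slope on the predictor

# Enumerate every combination of choices and record the resulting estimate.
choices = list(itertools.product([False, True], repeat=3))
effects = [fit_effect(response, predictor, covariate, *c) for c in choices]
for c, e in zip(choices, effects):
    print(f"log={c[0]!s:5} covariate={c[1]!s:5} trim={c[2]!s:5} -> slope {e:+.3f}")
print(f"range of estimates: {min(effects):+.3f} to {max(effects):+.3f}")
```

A real multiverse analysis would enumerate far more decisions (data exclusions, model families, random-effect structures), but the pattern is the same: make each decision explicit, fit every reasonable combination, and report the full spread of estimates rather than a single preferred result.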

This ambitious study not only meticulously demonstrates the potential for analytical heterogeneity to substantially influence statistical outcomes in ecology and evolutionary biology; it also shows that this is a general concern extending well beyond the social sciences, where most of the prior work on this topic was conducted.

The authors hope their findings will encourage researchers, institutions, funding agencies and journals to support initiatives aimed at improving research rigor, ultimately strengthening the reliability of scientific knowledge.

Link to study: https://doi.org/10.1186/s12915-024-02101-x