Daily Archives: 20 April 2026

Can We Rely on the Science That Shapes Our World?

This investigation, published in Nature and led by Abel Brodeur, a professor at the University of Ottawa, emerged alongside a sweeping seven-year international effort examining whether academic findings endure over time, particularly in fields lacking a standardised measure of scientific credibility. That broader initiative, the Systematizing Confidence in Open Research and Evidence (SCORE) project, assessed nearly 4,000 social-science papers and concluded that roughly half of the tested studies could not be replicated, underscoring ongoing concerns about the reliability of published research.

Against this backdrop, Brodeur’s own work offers a more hopeful perspective. While the SCORE findings highlighted structural weaknesses in reproducibility across the social sciences, his study suggests that more recent research practices may be moving in a positive direction. Rather than reinforcing scepticism, his results point to tangible improvements in transparency and methodological rigour, especially within certain disciplines and publication environments.

Brodeur took a hands-on approach, organising focused replication exercises conducted during single-day events between 2022 and 2023. Through this process, his team examined 110 published articles and found that approximately 85 per cent were computationally reproducible, meaning the reported results could be regenerated from the authors' own data and code. This figure stands in notable contrast to earlier replication rates and provides a degree of reassurance at a time when public confidence in scientific evidence is under strain.

Reflecting on these findings, Brodeur emphasises that the study contributes to a growing body of systematic, large-scale evidence regarding the reliability of social science research. He argues that its immediate impact lies in reinforcing the importance of stronger research practices, including improved coding standards, more consistent data sharing, and greater overall transparency. These measures, he suggests, can help identify and correct errors before they influence policy decisions. Over the longer term, such openness may also rebuild trust in science by demonstrating its capacity for self-correction and accountability.

A key distinction between Brodeur’s work and the earlier SCORE project lies in the evolution of disclosure practices. His analysis indicates that more recent studies—particularly those published after 2018—are more likely to include accessible data and code, reflecting a shift towards open science norms. Brodeur contends that reproducing original research should become a routine expectation rather than an exceptional exercise. He further recommends that future studies draw broader conclusions by examining random samples of papers from journals with varying data-sharing policies, thereby providing a more comprehensive view of reproducibility across the academic landscape.

Beyond methodological considerations, Brodeur highlights the broader implications for equity and access within the research community. The increased availability of data, code, and open-source tools can democratise participation in scientific inquiry, enabling researchers from less well-resourced institutions or regions to engage more fully with cutting-edge work. In this sense, open science is not merely a technical improvement but a structural shift with the potential to reshape who can contribute to—and benefit from—academic knowledge. As reliance on research continues to grow in both public policy and everyday decision-making, such changes may play a critical role in strengthening both the credibility and inclusiveness of the scientific enterprise.

More information: Abel Brodeur et al, Reproducibility and robustness of economics and political science research, Nature. DOI: 10.1038/s41586-026-10251-x

Journal information: Nature

Provided by University of Ottawa