PAF: Scientific Reproducibility

By Stephanie Guerra, Kathryn Rough, L. Vivian Dong, Katriel Friedman, and Eric Gastfriend.

This is the Executive Summary of the final report from a Philanthropy Advisory Fellowship project to develop a strategy for improving reproducibility in science. The full report (redacted for client confidentiality) is available here. This research was conducted on behalf of the PAF client, the Laura and John Arnold Foundation.

Summary of Recommendations

In this report, we considered four broad interventions: post-publication peer review, quantitative metrics for transparency (e.g. a “transparency index”), tenure criteria, and characterizing dependency relationships between pieces of research. Each of these can be implemented in different ways and can improve reproducibility through multiple channels. Post-publication peer review makes it easier to evaluate published results (e.g. through comments on PubMed Commons), and it could also reduce bias by disincentivizing scientists from manipulating results, for fear of facing greater scrutiny. Quantitative metrics for transparency could shift the culture in science toward more open practices; greater openness would both expose results to more scrutiny (again discouraging bias) and make replications easier to produce (e.g. by giving researchers access to data and code). Tenure criteria could similarly drive a culture shift by requiring professors to perform replications and/or share data, though it seems difficult for philanthropy to have a direct effect in this area. Finally, characterizing relationships between articles can make research more efficient and accurate by allowing researchers to track the unintuitive, tertiary effects of a particular finding in a paper.

Our recommendations are:

1. Fund and organize a conference to brainstorm and define a reproducibility metric, possibly a transparency index,

2. Approach [redacted], a stealth-mode startup seeking to quantify the reproducibility of observations in the biological sciences, with an offer of funding in exchange for more open availability of their product,

3. Fund the Global Biological Standards Institute to develop training modules for scientists and to track the biomedical community's adoption of practices that improve reproducibility,

4. Speak with the team at Casetext, a legal innovation startup, about how their model of crowdsourcing the characterization of relationships between cases could be applied to science.
