The Social Science Reproduction Platform (SSRP) is an open source platform that facilitates the sourcing, cataloging, and review of attempts to verify and improve the computational reproducibility of published social science research.
Computational reproducibility is the ability to reproduce the results, tables, and figures of a paper using the available data, code, and materials. A related concept is replicability, the ability to corroborate a study's findings using different data or different methods (or both).
SSRP users can 1) record the results of verifications and improvements of the computational reproducibility of published claims; 2) review, comment on, and collaborate on reproduction attempts submitted by other users; and 3) access reproducibility scores aggregated across papers, journals, sub-disciplines, and timespans.
In 2019, the American Economic Association (AEA) updated its Data and Code Availability Policy to require that the AEA Data Editor verify the reproducibility of all papers before they are accepted for publication in its journals. A similar policy was also adopted by the American Journal of Political Science, a leading journal in political science. Such policy changes are an important step toward improving the computational reproducibility of published research, which several studies have shown to be alarmingly low.
SSRP provides a central catalogue of verifications of, and improvements to, the computational reproducibility of published work in the social sciences.
Who SSRP is for
SSRP is open to all social science researchers interested in advancing the reproducibility of research. The platform may be especially relevant for:
- Students reproducing published papers as part of coursework, and instructors teaching courses on empirical research methods;
- Researchers at various career stages interested in improving computational reproducibility in the social sciences;
- Meta-researchers interested in analyzing the reproducibility of published work in the social sciences.
SSRP was developed as part of the Accelerating Computational Reproducibility in Economics (ACRE) project led by the Berkeley Initiative for Transparency in the Social Sciences (BITSS) in collaboration with the AEA Data Editor. BITSS is an initiative of the Center for Effective Global Action (CEGA), a UC Berkeley-based hub for research on global development.
The SSRP approach
Assessments of reproducibility can easily gravitate toward binary judgments that declare an entire paper "reproducible" or "not reproducible," and are often accompanied by charged or adversarial language that can be unproductive. SSRP users instead assess the computational reproducibility of individual claims through a process based on a standardized protocol. The protocol describes detailed steps and criteria for assessing and improving reproducibility, and offers guidance for constructive and efficient communication between reproducers and original authors. Get started improving computational reproducibility -- one claim at a time!
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.