Post-publication reviews by ReproducibiliTea journal clubs
The steering committee provides resources to make getting started as easy as possible: a list of suggested review targets, peer review guidelines and recommendations, a list of venues where commentaries can be published, and a list of funding opportunities.
Because this project is still in its pilot phase (started January 2025), these documents are not yet publicly available. Interested in participating in the project? Email us at reproducibilitea@gmail.com and we will send you more information, including resources.
For more information about the project, see our blog post and the posts by ReproducibiliTea organisers [blog post 1] [blog post 2] [blog post 3].
In October 2025, we organised two webinars on peer review and its diversification (see the full programme). The video recordings are available on our YouTube channel.
Resources: peer review guidelines and recommendations
- Malički, M., & Mehmani, B. (2024). Structured peer review: pilot results from 23 Elsevier journals. https://doi.org/10.7717/peerj.17514 [The list of questions is available in Box 1 of the results]
- ECR Reviewers Platform https://ecr-reviewers.gitlab.io/guide/
- Toolkit on peer review by the European Association of Science Editors (EASE) https://ease.org.uk/communities/peer-review-committee/peer-review-toolkit/
- Equator Network - reporting guidelines for many study types https://www.equator-network.org/reporting-guidelines/
- Study Quality Assessment Tools https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools
- The Catalogue of Bias https://catalogofbias.org/biases/
- The ‘REAPPRAISED’ checklist for evaluation of publication integrity http://resource-cms.springernature.com/springer-cms/rest/v1/content/17589730/data/v1
- Corneille et al. (2023). Beware ‘persuasive communication devices’ when writing and reading scientific articles. https://doi.org/10.7554/eLife.88654 [Table 1. Persuasive communication devices]
- Vazire et al. (2022). Credibility Beyond Replicability: Improving the Four Validities in Psychological Science. https://doi.org/10.1177/09637214211067779 & online tool: https://www.seaboat.io/
- Holford et al. (2024). Engaging undergraduate students in preprint peer review. https://doi.org/10.1177/14697874241264495 - see Supplemental Appendix 1 for the list of questions, or OSF folder: https://osf.io/7yqv3
- Bekkers, R. (2020, May 2). How to review a paper, including a checklist for hypothesis-testing research reports. https://doi.org/10.31219/osf.io/7ug4w
- Davis et al. (2018). Peer-review guidelines promoting replicability and transparency in psychological science. https://doi.org/10.1177/2515245918806489
- Flake, J. K., & Fried, E. I. (2020). Measurement schmeasurement: Questionable measurement practices and how to avoid them. https://doi.org/10.1177/2515245920952393 [Table 1. Six Questions to Promote Transparent Reporting of Measurement Practices]
- Hahn et al. (2024, September 24). More Than Box-ticking? Assessing Preregistration Quality in Psychological Research. https://doi.org/10.31219/osf.io/wc7qr
- Lakens, D. (2024). When and How to Deviate From a Preregistration. https://doi.org/10.1525/collabra.117094
- Bakker et al. (2020). Ensuring the quality and specificity of preregistrations. https://doi.org/10.1371/journal.pbio.3000937
- Brandt et al. (2014). The Replication Recipe: What makes for a convincing replication? https://doi.org/10.1016/j.jesp.2013.10.005
- Quantitative manuscript peer review template http://mgto.org/peerreviewtemplate
- RRR Assessment Peer Review, Collaborative Template https://mgto.org/rrrassessmentreviewtemplate