Chapter 17 Reporting & Reproducibility

In the previous chapters, we covered a range of techniques, approaches, and strategies for conducting a meta-analysis in R. Nevertheless, these statistical procedures make up only a small proportion of the meta-analysis process in real life. In meta-analysis projects “in the wild”, it can happen that:

  • You find an error in your meta-analysis R code and therefore have to repeat some steps of your analysis with a few changes,
  • Collaborators or reviewers suggest using a different approach or model, or performing a sensitivity analysis on your data,
  • You want to delegate parts of the analysis to a collaborator and have to send them the current status of your analysis,
  • You have to stop working on your project for some time, which means you might forget some steps of your analysis before you return to it,
  • You want to share the results of your analysis with project collaborators who may not be familiar with R and may not have RStudio installed.

These are just a few scenarios, but they make clear that creating a reproducible workflow for your meta-analyses in R benefits both you and others. Aiming for reproducibility is also a cornerstone of Open Science practices. Making your meta-analysis fully reproducible gives other researchers complete transparency about how you arrived at your results. RStudio is an excellent tool for creating such a workflow and for facilitating collaboration on a project. In this chapter, we introduce two tools to reproduce, report, and disseminate your analyses: R Markdown and the Open Science Framework (OSF).
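To give a first impression of the R Markdown approach, below is a minimal sketch of what such a file could look like. The file name `madata.rda`, the data set `madata`, and its columns are hypothetical placeholders for your own data; the pooling step uses the `metagen` function from the {meta} package covered in earlier chapters.

````markdown
---
title: "Example Meta-Analysis Report"
author: "Jane Doe"
output: html_document
---

```{r setup, include=FALSE}
# Load the {meta} package and a hypothetical data file 'madata.rda',
# which is assumed to contain a data frame called 'madata'
library(meta)
load("madata.rda")
```

## Pooled Effect

```{r pooling}
# 'madata' is assumed to have the columns TE (effect size),
# seTE (standard error) and author (study label)
m.gen <- metagen(TE = TE, seTE = seTE, studlab = author,
                 data = madata, sm = "SMD")
summary(m.gen)
```
````

Because the knitted report is generated directly from the code, the text, tables, and figures it contains cannot drift out of sync with the analysis that produced them: re-running the file reproduces every result from scratch.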

