Peer-review started: March 19, 2019
First decision: March 19, 2019
Revised: March 23, 2019
Accepted: March 25, 2019
Article in press: March 26, 2019
Published online: March 31, 2019
Processing time: 12 days and 14.8 hours
Irreproducibility of research is a major concern in academia. This concern affects all study designs across all scientific fields. Without testing for reproducibility and replicability, it is almost impossible to repeat a piece of research and obtain the same or similar results. In addition, irreproducibility limits the translation of research findings into practice, where the same results are expected. To find solutions, the Interacademy Partnership for Health gathered academics from established networks in science, medicine, and engineering around one table to introduce seven strategies that can enhance reproducibility: pre-registration, open methods, open data, collaboration, automation, reporting guidelines, and post-publication reviews. The current editorial discusses the generalisability and practicality of these strategies for systematic reviews and argues that systematic reviews have even greater potential than other research designs to lead the movement toward reproducibility of research. Moreover, I discuss the potential of reproducibility, in turn, to upgrade the systematic review from literature review to research. Furthermore, I refer to successful and ongoing practices from collaborative efforts around the world to encourage systematic reviewers, journal editors and publishers, organizations involved in evidence synthesis, and funders and policy makers to facilitate this movement and to earn public trust in research.
Core tip: Reproducibility increases the practicality of research findings and earns public trust in research. The ongoing developments in the automation of systematic reviews, the availability of a pre-registration platform, the reliance largely on secondary data or anonymized primary data, the culture of collaboration among the organizations that produce systematic reviews, and finally an update step that mandates replicability are all reasons why systematic reviews, among all research designs, have the potential to lead the movement toward reproducibility. Meanwhile, reproducibility can help systematic reviews to be considered a research design rather than a literature review.
- Citation: Shokraneh F. Reproducibility and replicability of systematic reviews. World J Meta-Anal 2019; 7(3): 66-71
- URL: https://www.wjgnet.com/2308-3840/full/v7/i3/66.htm
- DOI: https://dx.doi.org/10.13105/wjma.v7.i3.66
Systematic reviews sit at the top levels of the evidence hierarchy in clinical practice[1]. People involved in healthcare systems use systematic reviews in research, policy, and practice[3], trusting that the results will be reproducible when implemented[2]. At the same time, some critics argue that systematic reviews are literature reviews, not research[4,5]. To use systematic reviews in practice and to call them research studies, we need reproducibility testing; and to ensure that a systematic review is reproducible, it is important to design, record, and report systematic reviews in a transparent and reproducible way, and to prioritize and fund reproducible reviews[6]. Some suggest that a team independent of the original team should repeat a systematic review to ensure its reproducibility[7]. Since conducting systematic reviews is already time-consuming[8] and resource-intensive[9], it is arguable whether adding further steps, such as a reproducibility test requiring more time and resources, could reduce waste and increase value.
In the context of this paper, reproducibility is re-conducting the same study, using the same methods and data, by a different researcher or team, while replicability is re-doing the same study by gathering new data or recollecting the data[10].
To provide solutions for irreproducibility, the Interacademy Partnership for Health introduced seven strategies to enhance reproducibility practice in science[11]. This editorial discusses the progress in applying these strategies to the systematic reviewing process and calls for collaboration at all levels of the system to enhance the reproducibility of systematic reviews.
Currently, prospective registration of systematic review protocols in PROSPERO, a register of systematic review protocols, is recommended[12]. Compared with clinical trials, which have at least 17 registries[13], there is only one register for systematic reviews; moreover, unlike for clinical trials, it is not yet mandatory to register systematic reviews prospectively[14]. Today, PROSPERO holds only 30,000 records of conducted, ongoing, awaiting-assessment, and abandoned reviews (less than a third of the 100,000 systematic reviews in MEDLINE)[15], it does not support a quality control mechanism[16], and it lacks a rigorous follow-up procedure for abandoned systematic reviews[17]. On the bright side, registration of published reviews is associated with higher review quality[18]. Allocating more resources to this register, training and encouraging systematic reviewers to register their reviews, and making pre-registration a standard for bias control will push reproducibility from theory toward practice.
Researchers should share the search strategies for all databases[19] and the analytical code for meta-analyses[20] as part of the methods of their systematic reviews. Following prospective registration and publication of the protocol, researchers and research audiences can assess reproducibility and detect whether any deviation from the protocol carries important implementation messages for research, policy, and practice[12]. This practice serves not only to test reproducibility but also to support replication of another analysis or a new update of the systematic review. None of this is possible without access to all search strategies and the statistical code behind the meta-analyses.
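To make this concrete, a shared analysis script lets any reader re-run the pooling step exactly as reported. Below is a minimal sketch in Python, with hypothetical effect sizes and standard errors, of the kind of self-contained, re-runnable analytical code that could accompany a meta-analysis; the editorial prescribes no particular language or tool, so the choices here are illustrative assumptions.

```python
# A minimal sketch of shareable analytical code: fixed-effect
# inverse-variance pooling of study effect estimates.
# All data below are hypothetical.
import math

def fixed_effect_pool(effects, std_errors):
    """Pool study effects with inverse-variance weights.

    Returns the pooled effect, its standard error, and a 95% CI.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical log odds ratios and standard errors from three trials
effects = [-0.35, -0.10, -0.22]
std_errors = [0.12, 0.20, 0.15]
pooled, se, (low, high) = fixed_effect_pool(effects, std_errors)
print(f"Pooled effect: {pooled:.3f} (SE {se:.3f}, 95% CI {low:.3f} to {high:.3f})")
```

Sharing such a script alongside the review allows an independent team to reproduce the reported estimates to the last decimal, and to detect immediately when they cannot.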
Search results (excluding copyrighted abstracts and database-specific metadata) in Research Information Systems (RIS) format[21], together with the data and metadata extracted from the included studies, are the main datasets produced during systematic reviewing[22-24]. Access to open data from systematic reviews makes it possible to re-screen the search results, to de-duplicate the update searches, to re-run the meta-analyses, and to test the reproducibility of the searching, screening, and data analysis steps. These data gain further value if they are shared alongside their associated metadata following the FAIR principles (findable, accessible, interoperable, and reusable)[25]. There have already been calls for sharing data from systematic reviews, but no policy or action is yet in place[22-24]. Sharing the data from all systematic reviews can lead to data-driven innovations, with potential for knowledge discovery, reduced waste of resources, and lives saved.
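As an illustration of why machine-readable formats such as RIS matter here, the hedged Python sketch below de-duplicates an update search against a previously shared export by DOI, falling back to a normalized title; the parsing and matching rules are simplified assumptions rather than a prescribed method.

```python
# A minimal sketch, assuming search results shared as RIS exports.
# Field tags follow the standard RIS convention (TI = title,
# DO = DOI); each record ends with an "ER  -" line.
def parse_ris(text):
    """Split an RIS export into records of {tag: value} pairs."""
    records, current = [], {}
    for line in text.splitlines():
        if line.startswith("ER  -"):          # end of record
            records.append(current)
            current = {}
        elif len(line) > 6 and line[2:6] == "  - ":
            current.setdefault(line[:2], line[6:].strip())
    return records

def record_key(rec):
    """Prefer the DOI; fall back to a whitespace-normalized title."""
    return rec.get("DO", "").lower() or " ".join(rec.get("TI", "").lower().split())

def deduplicate(old_records, new_records):
    """Keep only update-search records absent from the shared export."""
    seen = {record_key(r) for r in old_records if record_key(r)}
    return [r for r in new_records if record_key(r) not in seen]
```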
Collaboration among research teams, on a small or large scale, increases the chance of wider expert input and enhances error detection and correction[26,27]. Sharing data among collaborators or interested research groups can bring together the data and resources needed for re-analyzing the same data[20] or for innovations[23] that are impossible without such collaboration. It is not good practice to hold data for years in the hope of receiving funding or innovating, when sharing could result in faster innovation, credit, or collaboration on grant applications[26,27]. It also raises a question of morality and mortality: is it ethical to hold data when sharing it could lead to decisions that save public resources and lives and reduce waste? The data extracted from other primary research for systematic reviews cannot be owned by the systematic reviewers or by the organizations that produce the systematic reviews.
The International Collaboration for the Automation of Systematic Reviews (ICASR) produces annual progress reports on the automation of systematic reviews[28-30]. This collaboration understands well that automation is a key to reproducibility, and it follows the Vienna Principles, which also emphasize the replicability of automation activities and the sharing of program code for wider use by the community[28]. The value of automation becomes more obvious in light of reports of human error in the searching[31] and data extraction[32] steps of systematic reviews. Machine-provided services can speed up the process and reduce the waste caused by human error through standardization of practices such as the statistical analysis or write-up steps of systematic reviews[30,33]. Despite all the technological development, systematic reviewers have underused automation tools[34]. Currently, the Systematic Review Data Repository[35], EPPI-Reviewer[36], Study-Based Registers[37], and the Evidence Pipeline, as semi-automated systems, have the potential to evolve into fully automated systems for systematic reviews.
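To illustrate the kind of step that lends itself to standardization, the hypothetical Python sketch below pre-screens titles against protocol keywords so that human reviewers assess only plausible candidates. Production tools such as those cited above typically rely on machine learning; this rule-based filter, with invented protocol terms, is only a toy illustration of the idea.

```python
# A hypothetical rule-based pre-screening filter; the protocol
# terms below are invented for illustration only.
INCLUDE_TERMS = {"randomized", "randomised", "trial"}

def prescreen(titles):
    """Split titles into (candidates, auto_excluded) by keyword match."""
    candidates, auto_excluded = [], []
    for title in titles:
        words = set(title.lower().split())
        (candidates if words & INCLUDE_TERMS else auto_excluded).append(title)
    return candidates, auto_excluded
```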
Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)[38] now celebrates a decade of use in the reporting step of systematic reviews, and major journals require systematic reviewers to follow the PRISMA family of guidelines in their reporting. Such reporting guidelines help researchers to report certain items for publication; advocating reproducibility is not their primary purpose[6]. An update, PRISMA 2019, is in progress; it will include more items, some of which can maximize reproducibility practice[6].
Pre-publication peer review is limited to a few people, while post-publication review gives a wider audience the chance to appraise and comment on aspects of the research. Post-publication activities take many forms, including letters to the editor, commentaries, blogs, and other social media posts[26]. These reviews are separate and independent from the original research, and the only connection is through a link or citation. As a result, it is hardly possible to find all these reviews integrated in one place. This problem grows when the original systematic review is retracted or when the findings are salami-sliced across several papers. Such post-publication reviews are nevertheless encouraged, in particular for systematic reviews, because they can be taken into account in the next update of the review. Having an update step in the development of systematic reviews, unlike other published literature, is a unique advantage that allows reviewers to correct their mistakes and errors or to add new data or aspects to the review.
In addition to these strategies, it is important not to overlook the process of systematic reviewing itself and its connection to reproducibility. The routine practice in systematic reviews is to involve at least two researchers in the screening and data extraction steps to reduce human error[32,39] through double-checking of decisions and reaching agreement. Such agreement sometimes requires a discussion between the two reviewers or inviting comments from another, usually senior, researcher. This means the decision on the eligibility of studies or the accuracy of the extracted data is replicated two or three times. Since this process replicates part of the review and has value for improving reproducibility, some automation and semi-automation systems allow researchers to document the double- and triple-checking within the system; for transparency, this documentation needs to be shared as well. In other words, the process should be documented and shared publicly.
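One way to document this double-checking transparently is to report an agreement statistic alongside the shared screening decisions. The Python sketch below, using hypothetical include/exclude decisions, computes Cohen's kappa for two reviewers; the editorial does not prescribe a particular statistic, so this choice is an illustrative assumption.

```python
# A minimal sketch with hypothetical data: quantifying agreement
# between two reviewers' independent screening decisions.
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    # Guard against the degenerate case where chance agreement is total
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

# Hypothetical include/exclude decisions on ten records
reviewer_1 = ["in", "out", "out", "in", "in", "out", "out", "out", "in", "out"]
reviewer_2 = ["in", "out", "in", "in", "in", "out", "out", "out", "out", "out"]
print(f"Cohen's kappa: {cohens_kappa(reviewer_1, reviewer_2):.2f}")
```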
Systematic reviews have great potential to lead reproducibility practice among study designs across scientific fields because: A. Having an update step allows systematic reviews to be corrected and helps in advancing 'living systematic reviews'; B. Unique progress in the automation of systematic reviews helps researchers to save time and resources in every step of the reviewing process; C. Provision of the protocol and methods facilitates the replication of the systematic review at the update step. To create such a role model, the organizations whose main activity is producing systematic reviews should come together and collaborate on developing policies on reproducibility and on sharing the data and methods from within systematic reviews. These organizations also have their own journal platforms, and the journal publishers themselves need to engage in this policy development as well. To avoid meta-waste, the Cochrane Database of Systematic Reviews, Systematic Reviews, World Journal of Meta-Analysis, JBI Database of Systematic Reviews and Implementation Reports, and Environmental Evidence now have a great opportunity to come together and set the bar on the reproducibility of systematic reviews.
Manuscript source: Invited manuscript
Specialty type: Medicine, research and experimental
Country of origin: United Kingdom
Peer-review report classification
Grade A (Excellent): 0
Grade B (Very good): 0
Grade C (Good): 0
Grade D (Fair): 0
Grade E (Poor): 0
P-Reviewer: A S-Editor: Dou Y L-Editor: A E-Editor: Wu YXJ
1. Centre for Evidence-based Medicine. Oxford Centre for Evidence-based Medicine – Levels of Evidence, 2009. Available from: https://www.cebm.net/2009/06/oxford-centre-evidence-based-medicine-levels-evidence-march-2009/
2. Ahmad N, Boutron I, Dechartres A, Durieux P, Ravaud P. Applicability and generalisability of the results of systematic reviews to public health practice and policy: a systematic review. Trials. 2010;11:20.
3. Chalmers I, Fox DM. Increasing the Incidence and Influence of Systematic Reviews on Health Policy and Practice. Am J Public Health. 2016;106:11-13.
4. Campbell A. A Quick Guide to Research Methods. ANZJFT. 2004;25:165-167.
5. Petticrew M. Systematic reviews from astronomy to zoology: myths and misconceptions. BMJ. 2001;322:98-101.
6. Page MJ, Altman DG, Shamseer L, McKenzie JE, Ahmadzai N, Wolfe D, Yazdi F, Catalá-López F, Tricco AC, Moher D. Reproducible research practices are underused in systematic reviews of biomedical interventions. J Clin Epidemiol. 2018;94:8-18.
7. Faggion CM. Should a systematic review be tested for reproducibility before its publication? J Clin Epidemiol. 2019.
8. Sampson M, Shojania KG, Garritty C, Horsley T, Ocampo M, Moher D. Systematic reviews can be produced and published faster. J Clin Epidemiol. 2008;61:531-536.
9. Borah R, Brown AW, Capers PL, Kaiser KA. Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open. 2017;7:e012545.
10. Patil P, Peng RD, Leek JT. A statistical definition for reproducibility and replicability. bioRxiv. 2016;066803.
11. The Interacademy Partnership for Health. A call for action to improve the reproducibility of biomedical research. Available from: https://research-integrity.uq.edu.au/files/4502/IAPforHealth-statement-Sep2016.pdf
12. Stewart L, Moher D, Shekelle P. Why prospective registration of systematic reviews makes sense. Syst Rev. 2012;1:7.
13. World Health Organization. International Clinical Trial Registry Platform: Data Providers, 2019. Available from: https://www.who.int/ictrp/search/data_providers/en/
14. Booth A, Stewart L. Trusting researchers to use open trial registers such as PROSPERO responsibly. BMJ. 2013;347:f5870.
15. Page MJ, Shamseer L, Tricco AC. Registration of systematic reviews in PROSPERO: 30,000 records and counting. Syst Rev. 2018;7:32.
16. Booth A, Clarke M, Dooley G, Ghersi D, Moher D, Petticrew M, Stewart L. The nuts and bolts of PROSPERO: an international prospective register of systematic reviews. Syst Rev. 2012;1:2.
17. Andrade R, Pereira R, Weir A, Ardern CL, Espregueira-Mendes J. Zombie reviews taking over the PROSPERO systematic review registry. It's time to fight back! Br J Sports Med. 2017.
18. Sideri S, Papageorgiou SN, Eliades T. Registration in the international prospective register of systematic reviews (PROSPERO) of systematic review protocols was associated with increased review quality. J Clin Epidemiol. 2018;100:103-110.
19. Koffel JB, Rethlefsen ML. Reproducibility of Search Strategies Is Poor in Systematic Reviews Published in High-Impact Pediatrics, Cardiology and Surgery Journals: A Cross-Sectional Study. PLoS One. 2016;11:e0163309.
20. Goldacre B. All BMJ research papers should share their analytic code. BMJ. 2016;352:i886.
21. Shokraneh F. Reproducible and Replicable Search for Research Methods in Systematic Reviews. Search Solutions. 2018 Nov 26; London, UK.
22. Haddaway NR. Open Synthesis: on the need for evidence synthesis to embrace Open Science. Environmental Evidence. 2018;7:26.
23. Shokraneh F, Adams CE, Clarke M, Amato L, Bastian H, Beller E, Brassey J, Buchbinder R, Davoli M, Del Mar C, Glasziou P, Gluud C, Heneghan C, Hoffmann T, Ioannidis JP, Jayaram M, Kwong J, Moher D, Ota E, Sheriff RS, Vale L, Goldacre B. Why Cochrane should prioritise sharing data. BMJ. 2018;362:k3229.
24. Wolfenden L, Grimshaw J, Williams CM, Yoong SL. Time to consider sharing data extracted from trials included in systematic reviews. Syst Rev. 2016;5:185.
25. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, Blomberg N, Boiten JW, da Silva Santos LB, Bourne PE, Bouwman J, Brookes AJ, Clark T, Crosas M, Dillo I, Dumon O, Edmunds S, Evelo CT, Finkers R, Gonzalez-Beltran A, Gray AJ, Groth P, Goble C, Grethe JS, Heringa J, 't Hoen PA, Hooft R, Kuhn T, Kok R, Kok J, Lusher SJ, Martone ME, Mons A, Packer AL, Persson B, Rocca-Serra P, Roos M, van Schaik R, Sansone SA, Schultes E, Sengstag T, Slater T, Strawn G, Swertz MA, Thompson M, van der Lei J, van Mulligen E, Velterop J, Waagmeester A, Wittenburg P, Wolstencroft K, Zhao J, Mons B. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3:160018.
26. Academy of Medical Sciences. Reproducibility and reliability of biomedical research: improving research practice, 2015. Available from: https://acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research
27. Academy of Medical Sciences. Improving research reproducibility and reliability: progress update from symposium sponsors, 2016. Available from: https://mrc.ukri.org/documents/pdf/reproducibility-update-from-sponsors/
28. Beller E, Clark J, Tsafnat G, Adams C, Diehl H, Lund H, Ouzzani M, Thayer K, Thomas J, Turner T, Xia J, Robinson K, Glasziou P; founding members of the ICASR group. Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR). Syst Rev. 2018;7:77.
29. O'Connor AM, Tsafnat G, Gilbert SB, Thayer KA, Wolfe MS. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR). Syst Rev. 2018;7:3.
30. O'Connor AM, Tsafnat G, Gilbert SB, Thayer KA, Shemilt I, Thomas J, Glasziou P, Wolfe MS. Still moving toward automation of the systematic review process: a summary of discussions at the third meeting of the International Collaboration for Automation of Systematic Reviews (ICASR). Syst Rev. 2019;8:57.
31. Sampson M, McGowan J. Errors in search strategies were identified by type and frequency. J Clin Epidemiol. 2006;59:1057-1063.
32. Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59:697-703.
33. Tsafnat G, Glasziou P, Choong MK, Dunn A, Galgani F, Coiera E. Systematic review automation technologies. Syst Rev. 2014;3:74.
34. van Altena AJ, Spijker R, Olabarriaga SD. Usage of automation tools in systematic reviews. Res Synth Methods. 2019;10:72-82.
35. Li T, Vedula SS, Hadar N, Parkin C, Lau J, Dickersin K. Innovations in data collection, management, and archiving for systematic reviews. Ann Intern Med. 2015;162:287-294.
36. Park SE, Thomas J. Evidence synthesis software. BMJ Evid Based Med. 2018;23:140-141.
37. Shokraneh F, Adams CE. Study-based registers of randomized controlled trials: Starting a systematic review with data extraction or meta-analysis. Bioimpacts. 2017;7:209-217.
38. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097.
39. Carroll C, Scope A, Kaltenthaler E. A case study of binary outcome data extraction across three systematic reviews of hip arthroplasty: errors and differences of selection. BMC Res Notes. 2013;6:539.