Systematic Reviews, Scoping Reviews, and other Knowledge Syntheses

Screening process

There are many resources that explain how to go about screening. One place to start is the guide on Screening Studies created by the librarians at the University of Toronto.

These librarians have also compiled example screening templates, available on the Open Science Framework.

Covidence

McGill University Library has a subscription to Covidence, a useful tool for importing database records, removing duplicate records, screening, documenting critical appraisal/risk of bias and data extraction, and exporting data. Covidence also allows members of the McGill community to invite external reviewers to join their review team.

For access and support for Covidence, please consult: https://support.covidence.org/help/mcgill-university-library

Other screening resources

Software packages specifically designed for knowledge synthesis (e.g., Covidence) will typically include a record screening/study selection function. This allows more than one reviewer to independently screen the records without seeing other reviewers' decisions to include or exclude studies, and thus reduces bias.

Some software tools also include screening prioritization, which re-sorts the remaining records by predicted relevance, based on your earlier decisions to include or exclude records.
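To make the idea concrete, here is a toy sketch of screening prioritization. This is an illustration under simplifying assumptions (simple word counts rather than a trained machine-learning model, and invented record titles), not the algorithm used by Covidence or any other specific tool: records already screened are used to score the remaining records, which are then re-sorted so the most likely includes appear first.

```python
# Toy illustration of screening prioritization (hypothetical data and
# scoring; real tools use trained machine-learning classifiers).
from collections import Counter

def tokenize(text):
    """Lowercase and split a record title, dropping very short words."""
    return [w for w in text.lower().split() if len(w) > 2]

def score(record, include_terms, exclude_terms):
    """Higher score = more similar to previously included records."""
    words = tokenize(record)
    return (sum(include_terms[w] for w in words)
            - sum(exclude_terms[w] for w in words))

def prioritize(unscreened, included, excluded):
    """Re-sort unscreened records by relevance to past include decisions."""
    include_terms = Counter(w for r in included for w in tokenize(r))
    exclude_terms = Counter(w for r in excluded for w in tokenize(r))
    return sorted(unscreened,
                  key=lambda r: score(r, include_terms, exclude_terms),
                  reverse=True)

# Hypothetical screening decisions made so far:
included = ["randomized trial of aspirin for stroke prevention"]
excluded = ["survey of nursing students attitudes"]
unscreened = [
    "qualitative study of student attitudes",
    "aspirin versus placebo randomized controlled trial",
]
ranked = prioritize(unscreened, included, excluded)
print(ranked[0])  # the trial-like record is now first in the queue
```

As more records are screened, the term counts are rebuilt from the growing set of decisions, so the ranking of the remaining records keeps improving; this is the core loop behind the machine-learning prioritization features mentioned above.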

Partial list of software tools using machine learning/AI for screening prioritization:

The following systematic review addresses how text mining is being used in the screening process:

Critical appraisal and tools to identify risk of bias

Critical appraisal should involve an assessment of the risk of bias in the relevant studies and may also involve an assessment of how the studies were reported.

As librarians, we are generally not involved in the appraisal process, but we can provide guidance on finding critical appraisal tools if needed. These tools are generally specific to a given study design or research methodology. The following are some suggested tools, but the list is not exhaustive, and not all of the tools have been validated.

Repositories or collections of tools:

  • CATevaluation: Assessment of critical appraisal tools
  • Quality Assessment and Risk of Bias Tool Repository
  • Critical appraisal tools (Joanna Briggs Institute) - Includes checklists for a wide range of studies including case reports, economic evaluations, incidence/prevalence, quasi-experimental studies, and text/opinion
  • CASP checklists (Critical Appraisal Skills Programme) - Includes checklists for a wide range of studies. Useful for novice appraisers, as the checklists provide prompting questions and identify the different elements of the study to assess
  • SIGN checklists (Healthcare Improvement Scotland) - Includes systematic reviews & meta-analyses, randomised controlled trials, cohort studies, case-control studies, diagnostic studies, economic studies
  • The Registry of Methods and Tools for Evidence-Informed Decision Making: Resources filtered to Appraisal (National Collaborating Centre for Methods and Tools)
  • Systematic Review Toolbox - Under Advanced Search, select Other Tools, then select Quality Checklist, i.e. Critical Appraisal

Other tools for specific contexts not necessarily covered above:

Reporting guidelines:

  • See the EQUATOR Network for reporting guidelines relevant to specific study designs, such as randomized controlled trials (CONSORT) and systematic reviews (PRISMA)

Resources on analyzing/synthesizing findings

Narrative synthesis of quantitative effect data

Thomson H, Campbell M. "Narrative synthesis" of quantitative effect data in Cochrane reviews: Current issues and ways forward [Internet]. Cochrane Learning Live Webinar Series; 2020 Feb. Available from: https://training.cochrane.org/resource/narrative-synthesis-quantitative-effect-data-cochrane-reviews-current-issues-and-ways

  • Part 1 helps navigate some of the confusion over the concepts of "narrative synthesis" or "qualitative review of (quantitative) data" versus the ambiguous use of the terms "narrative review" or "qualitative review"

Campbell, M., McKenzie, J. E., Sowden, A., Katikireddi, S. V., Brennan, S. E., Ellis, S., Hartmann-Boyce, J., Ryan, R., Shepperd, S., Thomas, J., Welch, V., & Thomson, H. (2020). Synthesis without meta-analysis (SWiM) in systematic reviews: Reporting guideline. BMJ, 368, l6890. doi:10.1136/bmj.l6890

Network meta-analysis

Hoaglin DC, Hawkins N, Jansen JP, Scott DA, Itzler R, Cappelleri JC, et al. Conducting indirect-treatment-comparison and network-meta-analysis studies: Report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: Part 2. Value Health. 2011;14(4):429-37.

Contact us

Notice

Due to a large influx of requests, there may be an extended wait time for librarian support on knowledge syntheses.

Find a librarian in your subject area to help you with your knowledge synthesis project.

Or contact the librarians at the
Schulich Library of Physical Sciences, Life Sciences, and Engineering
schulich.library@mcgill.ca
