Knowledge Syntheses - Systematic Reviews, Scoping Reviews and More

Introduction

There are generally four stages of a knowledge synthesis, each of which includes several tasks. Most of these tasks apply to all types of syntheses, but not every task is required for every review type. For example, scoping reviews do not necessarily require quality assessment of the included studies.

Included on this page are:

  • descriptions of the four stages, outlining the tasks typically carried out when doing a review. More in-depth and review-specific guides and training resources are linked throughout these descriptions, as well as in the Training Resources section of this guide.
  • timeline tables
  • methodology articles, books, websites

Stage 1 Preparation

Most types of knowledge syntheses cannot be conducted by one person. Ideally, you need a team of at least three people (one person may fill more than one role) that includes:

  • Project lead/manager
  • Subject experts with methodological expertise
  • Two people to screen the results independently
  • One person to act as a tie-breaker and make decisions if there is disagreement about whether a study meets the inclusion criteria
  • A librarian trained in knowledge synthesis searching
  • Someone with expertise in statistical analysis if doing a meta-analysis

Read these helpful Tips for a Successful Review Team from Northeastern University Library.

Developing clear and focused questions is crucial in a knowledge synthesis for several reasons:

  • Well-defined questions help structure the entire review process, from literature search to data extraction and analysis. They ensure that the review remains focused and relevant
  • Clear questions help in setting predefined criteria for including or excluding studies, which minimizes selection bias and ensures that the review is comprehensive and unbiased
  • Specific questions make the review process transparent and reproducible. Other researchers can follow the same steps and verify the findings
  • Focused questions provide precise answers that can inform clinical practice, policy-making, and further research. They help in synthesizing evidence that is directly applicable to real-world problems
  • By narrowing down the scope, well-developed questions help in efficiently using time and resources, avoiding unnecessary work on irrelevant studies.

Consider using one of the frameworks listed below to help guide question development. (Adapted from: Foster, M., & Jewell, S. (Eds.). (2017). Assembling the pieces of a systematic review: Guide for librarians. Medical Library Association; Lanham, MD: Rowman & Littlefield. p. 38, Table 3.3.)

For each framework below, you'll find what the mnemonic stands for, its original source, examples/templates (where available), and the type of question or originating discipline.

BeHEMoTh
  • Stands for: Be: behavior of interest; H: health context (service/policy/intervention); E: exclusions; MoTh: models or theories
  • Original source: Booth, A., & Carroll, C. (2015). Systematic searching for theory to inform systematic reviews: Is it feasible? Is it desirable? Health Information and Libraries Journal, 32(3), 220–235. https://doi.org/10.1111/hir.12108
  • Type of question/discipline: Questions about theories

CHIP
  • Stands for: Context; How; Issues; Population
  • Original source: Shaw, R. (2010). Conducting literature reviews. In M. A. Forrester (Ed.), Doing qualitative research in psychology: A practical guide (pp. 39-52). London: Sage.
  • Type of question/discipline: Psychology, qualitative

CIMO
  • Stands for: Context; Intervention; Mechanisms; Outcomes
  • Original source: Denyer, D., & Tranfield, D. (2009). Producing a systematic review. In D. A. Buchanan & A. Bryman (Eds.), The Sage handbook of organizational research methods (pp. 671-689). Thousand Oaks, CA: Sage Publications Ltd.
  • Type of question/discipline: Management, business, administration

CLIP
  • Stands for: Client group; Location; Improvement; Professionals
  • Original source: Wildridge, V., & Bell, L. (2002). How CLIP became ECLIPSE: A mnemonic to assist in searching for health policy/management information. Health Information & Libraries Journal, 19(2), 113–115. https://doi.org/10.1046/j.1471-1842.2002.00378.x
  • Type of question/discipline: Management, business, administration

CoCoPop
  • Stands for: Condition; Context; Population
  • Original source: Munn, Z., Moola, S., Lisy, K., Riitano, D., & Tufanaru, C. (2015). Methodological guidance for systematic reviews of observational epidemiological studies reporting prevalence and cumulative incidence data. International Journal of Evidence-Based Healthcare, 13(3), 147–153. https://doi.org/10.1097/XEB.0000000000000054
  • Type of question/discipline: Prevalence, incidence

COPES
  • Stands for: Client-Oriented; Practical; Evidence; Search
  • Original source: Gibbs, L. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Brooks/Cole-Thomson Learning.
  • Type of question/discipline: Social work, health care, nursing

ECLIPSe
  • Stands for: Expectation; Client; Location; Impact; Professionals; Service
  • Original source: Wildridge, V., & Bell, L. (2002). How CLIP became ECLIPSE: A mnemonic to assist in searching for health policy/management information. Health Information & Libraries Journal, 19(2), 113–115. https://doi.org/10.1046/j.1471-1842.2002.00378.x
  • Type of question/discipline: Management, services, policy, social care

PEO
  • Stands for: Population; Exposure; Outcome
  • Original source: Khan, K., Kunz, R., Kleijnen, J., & Antes, G. (2011). Systematic reviews to support evidence-based medicine: How to review and apply findings of healthcare research (2nd ed.). CRC Press.
  • Type of question/discipline: Qualitative

PECODR
  • Stands for: Patient/population/problem; Exposure; Comparison; Outcome; Duration; Results
  • Original source: Dawes, M., Pluye, P., Shea, L., Grad, R., Greenberg, A., & Nie, J.-Y. (2007). The identification of clinically important elements within medical journal abstracts: Patient_Population_Problem, Exposure_Intervention, Comparison, Outcome, Duration and Results (PECODR). Journal of Innovation in Health Informatics, 15(1), 9–16.
  • Type of question/discipline: Medicine

PerSPECTiF
  • Stands for: Perspective; Setting; Phenomenon of interest/Problem; Environment; Comparison (optional); Time/Timing; Findings
  • Original source: Booth, A., Noyes, J., Flemming, K., Moore, G., Tunçalp, Ö., & Shakibazadeh, E. (2019). Formulating questions to explore complex interventions within qualitative evidence synthesis. BMJ Global Health, 4(Suppl 1).
  • Type of question/discipline: Qualitative research

PESICO
  • Stands for: Person; Environments; Stakeholders; Intervention; Comparison; Outcome
  • Original source: Schlosser, R. W., & O'Neil-Pirozzi, T. (2006). Problem formulation in evidence-based practice and systematic reviews. Contemporary Issues in Communication Sciences and Disorders, 33, 5-10.
  • Type of question/discipline: Augmentative and alternative communication

PICO
  • Stands for: Patient; Intervention; Comparison; Outcome
  • Original source: Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. (1995). The well-built clinical question: A key to evidence-based decisions. ACP Journal Club, 123(3), A12.
  • Examples/templates: PICO (EBP Nursing Research Guide, University of Calgary); PICO examples (UNC University Libraries)
  • Type of question/discipline: Clinical medicine

PICO+
  • Stands for: Patient; Intervention; Comparison; Outcome; plus context, patient values, and preferences
  • Original source: Bennett, S., & Bennett, J. W. (2000). The process of evidence-based practice in occupational therapy: Informing clinical decisions. Australian Occupational Therapy Journal, 47(4), 171-180.
  • Type of question/discipline: Occupational therapy

PICOC
  • Stands for: Patient; Intervention; Comparison; Outcome; Context
  • Original source: Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Malden, MA: Blackwell Publishers.
  • Type of question/discipline: Social sciences

PICOS
  • Stands for: Patient; Intervention; Comparison; Outcome; Study type
  • Original source: Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(7), e1000097.
  • Type of question/discipline: Medicine

PICOT
  • Stands for: Patient; Intervention; Comparison; Outcome; Time
  • Original source: Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. (1995). The well-built clinical question: A key to evidence-based decisions. ACP Journal Club, 123(3), A12.
  • Type of question/discipline: Education, health care

PICO specific to diagnostic tests
  • Stands for: Patient/participants/population; Index tests; Comparator/reference tests; Outcome
  • Original source: Kim, K. W., Lee, J., Choi, S. H., Huh, J., & Park, S. H. (2015). Systematic review and meta-analysis of studies evaluating diagnostic test accuracy: A practical review for clinical researchers - Part I. General guidance and tips. Korean Journal of Radiology, 16(6), 1175-1187.
  • Type of question/discipline: Diagnostic questions

ProPheT
  • Stands for: Problem; Phenomenon of interest; Time
  • Original sources: Booth, A., Noyes, J., Flemming, K., Gerhardus, A., Wahlster, P., van der Wilt, G. J., ... & Rehfuess, E. (2016). Guidance on choosing qualitative evidence synthesis methods for use in health technology assessments of complex interventions [Technical report]. https://doi.org/10.13140/RG.2.1.2318.0562; and Booth, A., Sutton, A., & Papaioannou, D. (2016). Systematic approaches to a successful literature review (2nd ed.). London: Sage.
  • Type of question/discipline: Social sciences, qualitative, library science

SPICE
  • Stands for: Setting; Perspective; Interest; Comparison; Evaluation
  • Original source: Booth, A. (2006). Clear and present questions: Formulating questions for evidence based practice. Library Hi Tech, 24(3), 355-368.
  • Type of question/discipline: Library and information sciences

SPIDER
  • Stands for: Sample; Phenomenon of interest; Design; Evaluation; Research type
  • Original source: Cooke, A., Smith, D., & Booth, A. (2012). Beyond PICO: The SPIDER tool for qualitative evidence synthesis. Qualitative Health Research, 22(10), 1435-1443.
  • Type of question/discipline: Health, qualitative research

 

Once you have your question, you’ll need to look for existing reviews and ongoing protocols on your topic to make sure you’re not duplicating work. This may also help you make any necessary changes to your question to address other gaps in the research.

While searching for reviews you'll also be able to assess the volume of potentially relevant studies as well as identify key studies. Key studies can be used as 'seed articles' providing a starting point for developing comprehensive search strategies. They can also serve as a benchmark for assessing the success of your search strategy (i.e., seminal articles should be in the results of your search), as well as the quality and relevance of other studies included in the review. 

Searching for protocols
Searching for published reviews

Some published reviews can be found in many of the resources above, but not all reviews are published with Cochrane, JBI or the Campbell Collaboration. You will also need to search in databases such as CINAHL, ERIC, Medline, PsycINFO, and the Web of Science. Google Scholar may also be useful when searching for published reviews.

Database search tips:
  • Filter the results of a topic search with a database filter (e.g., publication type or methodology)
    • CINAHL, Medline and PubMed all have the publication type filter "systematic review"
    • PsycINFO has a methodology filter "systematic review"
  • To your topic search, add a keyword search for the type of review you're looking for (e.g., scoping review, systematic review, integrative review).
  • If you're seeking existing systematic reviews, try a pre-established search filter. Here are some examples: 

There are many types of reviews, each suited to a particular purpose. The following need to be taken into consideration when deciding upon a review type1:

  • What is the nature of your question?
    • Is the question best answered by quantitative studies (systematic review; meta-analysis) or qualitative studies (meta-synthesis) or both (integrative review)?
    • Are you more interested in describing the nature and extent of the research in a particular area rather than answering a specific question (scoping review)?
  • Is the review to be undertaken by a team or will you be conducting this on your own?
    • To minimize bias, a true systematic review is supposed to be undertaken by a review team, with screening and appraisal conducted independently by at least two team members with a third member available to settle disagreements.
    • A meta-analysis requires the pooling and statistical analysis of study results.  If you do not have the expertise in statistical analysis, will you be on a team that includes a statistician?
    • Single-author reviews labeled 'systematic reviews' do appear in the form of Masters theses, PhD dissertations and capping projects, but a review labeled as a systematic review and authored by a single person is not likely to be published in a journal.
      • It is still possible to have a non-systematic review (e.g. critical reviews, narrative reviews) published in a reputable journal. Might it be better to avoid labeling your review as a systematic review if it does not meet the criteria of a true systematic review?
  • How much literature do you expect to retrieve? 
    • Will you have the time to screen your search results and summarize included studies in the time frame allotted if you undertake one of the more rigorous types of reviews?
  • If you are a graduate student, what are the expectations of your supervisor?
    • Are they expecting you to do a specific type of review?
    • It is a good idea to discuss with your supervisor exactly what they expect of the review, e.g. do they expect you to use a specific guideline (PRISMA), will they accept a 'mini' version of one of the more rigorous reviews if time and resources are limited?

The following resources may help you decide which type of review best suits your research question(s):

Decision tools:
Articles:
  • Cook, C. N., Nichols, S. J., Webb, J. A., Fuller, R. A., & Richards, R. M. (2017). Simplifying the selection of evidence synthesis methods to inform environmental decisions: A guide for decision makers and scientists. Biological Conservation, 213, 135–145. https://doi.org/10.1016/j.biocon.2017.07.004
  • Gough, D., Thomas, J., & Oliver, S. (2012). Clarifying differences between review designs and methods. Systematic Reviews, 1(1), 28. https://doi.org/10.1186/2046-4053-1-28
  • Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26(2), 91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x
  • Munn, Z., Peters, M. D. J., Stern, C., Tufanaru, C., McArthur, A., & Aromataris, E. (2018). Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology, 18, 143. https://doi.org/10.1186/s12874-018-0611-x
  • Munn, Z., Stern, C., Aromataris, E., Lockwood, C., & Jordan, Z. (2018). What kind of systematic review should I conduct? A proposed typology and guidance for systematic reviewers in the medical and health sciences. BMC Medical Research Methodology, 18(1), 5. https://doi.org/10.1186/s12874-017-0468-4
  • Noble, H., & Smith, J. (2018). Reviewing the literature: Choosing a review design. Evidence-Based Nursing, 21(2), 39–41. https://doi.org/10.1136/eb-2018-102895
  • Schick-Makaroff, K., MacDonald, M., Plummer, M., Burgess, J., & Neander, W. (2016). What synthesis methodology should I use? A review and analysis of approaches to research synthesis. AIMS Public Health, 3(1), 172–215. https://doi.org/10.3934/publichealth.2016.1.172
  • Sutton, A., Clowes, M., Preston, L., & Booth, A. (2019). Meeting the review family: Exploring review types and associated information retrieval requirements. Health Information & Libraries Journal, 36(3), 202–222. https://doi.org/10.1111/hir.12276
  • Tricco, A. C., Tetzlaff, J., & Moher, D. (2011). The art and science of knowledge synthesis. Journal of Clinical Epidemiology, 64(1), 11–20. https://doi.org/10.1016/j.jclinepi.2009.11.007
  • Wang, J. (2019). Demystifying literature reviews: What I have learned from an expert? Human Resource Development Review, 18(1), 3–15. https://doi.org/10.1177/1534484319828857
  • Whittemore, R., Chao, A., Jang, M., Minges, K. E., & Park, C. (2014). Methods for knowledge synthesis: An overview. Heart & Lung, 43(5), 453–461. https://doi.org/10.1016/j.hrtlng.2014.05.014

References:

1. University of Alberta Library. (2024). Preparing for a comprehensive literature review. https://guides.library.ualberta.ca/reviewprep

Purpose

Regardless of the type of review, a protocol is highly recommended for these reasons:

  • can serve as a road map for your review;
  • specifies the objectives, methods, and outcomes of primary interest of the review;
  • promotes transparency of methods; and
  • allows your peers to review how you will extract information to quantitatively or qualitatively summarize your data.
Structure

A protocol can be formal, informal, published, registered, or presented at conferences. Here are some typical components of a protocol: 

  • Information about the research team
  • Support, funding
  • Background - topic, aims of the review, type of review, rationale
  • Research questions and question framework, if you're using one (e.g. PICO)
  • Standards or guidelines being used (e.g. PRISMA, Cochrane, JBI)
  • Sources and search strategies
    • Databases that will be searched
    • Foundational search strategy
    • Grey literature, if including (e.g., targeted website searches, Google searches)
    • Hand searches, if including (e.g., citation tracking, searching select journal TOCs, searching select authors)
    • Expert consultation, if including
  • Eligibility Criteria (inclusion and exclusion), for example:
    • research methodology
    • population groups
    • geography
    • publication dates
    • language
    • publication type (e.g., peer reviewed articles, dissertations, reports)
    • List of common inclusion criteria - University of Melbourne
  • Screening methods
    • number of reviewers
    • inter-rater reliability
    • tools  (e.g., Covidence, Zotero)
  • Quality assessment method and tools, if including 
  • Extraction method and tools
  • Analysis (qualitative, quantitative, thematic, meta-analysis, mixed method)
Protocol templates and guidelines: 
Where to publish your protocol:
Protocol Examples
  • Mapping Review

Protocol: Bottrill, M., Cheng, S., Garside, R., Wongbusarakum, S., Roe, D., Holland, M. B., Edmond, J., & Turner, W. R. (2014). What are the impacts of nature conservation interventions on human well-being: A systematic map protocol. Environmental Evidence, 3(1), 16. https://doi.org/10.1186/2047-2382-3-16

Published article: McKinnon, M. C., Cheng, S. H., Dupre, S., Edmond, J., Garside, R., Glew, L., Holland, M. B., Levine, E., Masuda, Y. J., Miller, D. C., Oliveira, I., Revenaz, J., Roe, D., Shamer, S., Wilkie, D., Wongbusarakum, S., & Woodhouse, E. (2016). What are the effects of nature conservation on human well-being? A systematic map of empirical evidence from developing countries. Environmental Evidence, 5(1), 8. https://doi.org/10.1186/s13750-016-0058-7

  • Systematic Review 

Protocol: Cusimano, M., Carpino, M., Walker, M., Rossi, E., Tang, M., Lightfoot, D., Mann, R., & Saarela, O. (2024). The effect of recreational cannabis legalization on substance use, mental health, and injury: A systematic review. PROSPERO. https://www.crd.york.ac.uk/PROSPERO/view/CRD42021265183

Published Article: Walker, M., Carpino, M., Lightfoot, D., Rossi, E., Tang, M., Mann, R., Saarela, O., & Cusimano, M. D. (2023). The effect of recreational cannabis legalization and commercialization on substance use, mental health, and injury: A systematic review. Public Health, 221, 87–96. https://doi.org/10.1016/j.puhe.2023.06.012

  • Scoping Review

Protocol: Harfield, S., Davy, C., Kite, E., McArthur, A., Munn, Z., Brown, N., & Brown, A. (2015). Characteristics of Indigenous primary health care models of service delivery: A scoping review protocol. JBI Evidence Synthesis, 13(11), 43. https://doi.org/10.11124/jbisrir-2015-2474

Published article: Harfield, S. G., Davy, C., McArthur, A., Munn, Z., Brown, A., & Brown, N. (2018). Characteristics of Indigenous primary health care service delivery models: A systematic scoping review. Globalization and Health, 14(1), 12. https://doi.org/10.1186/s12992-018-0332-2

 

 

Stage 2 Collecting Data & Screening

Knowledge syntheses require comprehensive searches across multiple databases. You need to search a minimum of two databases and typically no more than five. Searching multiple databases is crucial because it:

  • decreases the risk of missing relevant studies.
  • minimizes the risk of publication bias.
  • leads to more accurate and reliable conclusions, because you've increased the likelihood of finding all pertinent studies.

The databases you choose will depend on your research question and the disciplines in which relevant research may be conducted. ​

The databases you choose will also depend on what is available to you. ​Most databases are subscription based.

Use the library Research Guides or Find a Database list to select databases relevant to your topic, or consult with a librarian.

Commonly used databases include the following:

  • CINAHL (EBSCO) - nursing and allied health
  • ERIC (Proquest) - education
  • Medline (OVID) - biomedical
  • PubMed (NLM) - biomedical
  • PsycINFO (Proquest) - psychology
  • Sociological Abstracts (Proquest) - sociology
  • Web of Science (Clarivate) - multidisciplinary

It is also recommended to set up individual accounts in each of the databases so that you can save your search strategies, edit them as needed, and rerun them at a later date without having to enter the entire strategy again. See the tutorials below about how to do this in select platforms.

Before you begin searching for articles, it's important to plan how you will document your keyword brainstorming as well as your search strategies and results. You will need this information later for reporting and publication. You can use tables in a Word document or spreadsheets in Excel to do this.

Keyword/concept templates
Log your searches

Include information about the following (a simple search-logging sketch follows this list):

  • When you searched
  • Where you searched and why (e.g. Google Scholar, Web of Science, PsycINFO)
  • Search terms (e.g. text words or subject headings searched)
  • Limits (e.g. Language, Date, Peer Reviewed, Publication Type, Type of Research)
  • Results (number of articles found, number of articles saved/exported to citation manager)
  • Comments (e.g. combinations that were or were not successful, new words identified)
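If you prefer a spreadsheet, here is a minimal sketch in Python (the file name, column headings, and the sample row are illustrative assumptions, not a prescribed template) that creates a search-log CSV with one row per database search:

import csv

# Illustrative column headings based on the list above
COLUMNS = [
    "Date searched",
    "Database / source and why",
    "Search terms (text words / subject headings)",
    "Limits applied",
    "Results found",
    "Results exported to citation manager",
    "Comments",
]

def start_search_log(path="search_log.csv"):
    """Create an empty search log containing only the header row."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(COLUMNS)

def log_search(path, entry):
    """Append one search as a row; columns missing from the entry stay blank."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow([entry.get(col, "") for col in COLUMNS])

start_search_log()
log_search("search_log.csv", {
    "Date searched": "2025-01-15",
    "Database / source and why": "MEDLINE (OVID) - core biomedical coverage",
    "Search terms (text words / subject headings)": "exp Neoplasms/ OR cancer*.ti,ab.",
    "Results found": "1204",
})

The same columns work equally well as a table in Word; the point is simply to record every search consistently as you go.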
Search results templates
Printing/saving database search strategies

Some databases have options to print or save the text of your search strategy to a Word, Rich Text, or PDF document. If a database doesn't have this option, you can try copying and pasting the text or snipping an image of your strategy. To permanently save search strategies so that you can create an alert or run them again at a later date, you need to have an account in that database. See the tutorials below about how to do this in the platforms on which many of our databases are provided.

Search strategy development
  • Revisit your research question to structure your search and break it down into the main concepts. A question framework can help with this. 
  • Brainstorm and gather synonyms for your main concepts.
  • Use the key studies (seed articles) identified from your preliminary searching to help you gather synonyms. Look for these articles in databases to identify which subject headings and author keywords they've been tagged with. 
  • Mine knowledge syntheses with similar questions for keywords and subject headings used in their search strategies. Sometimes you'll find them in appendices or supplementary or additional information files. If using their exact strategy, remember to cite it. 
  • Map the terms you have listed to subject headings in the databases you plan to search. Medline, CINAHL, PsycINFO and ERIC are examples of databases that use subject headings (controlled vocabulary or thesaurus).
  • Construct a search string (an illustrative example follows this list).
    • Your search string will use the Boolean operators AND and OR to combine terms and subject headings.  
    • You may need to use proximity operators and truncation. 
    • There may be useful database filters you can add to the search instead of using keywords and subject headings (e.g., age groups, methodology, gender, geography).
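For illustration only, a two-concept search on a hypothetical topic (school gardens and children's nutrition knowledge) might combine OR'd synonyms within each concept and AND between the concepts, with truncation (*) picking up word variants:

("school garden*" OR "community garden*" OR "garden-based learning")
AND
("nutrition knowledge" OR "food literacy" OR "healthy eating" OR "dietary habit*")

In practice you would also map each concept to the subject headings of the database you are searching and add those with OR.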

Search strategy tools 

Search filters

Consider using search filters (also called hedges). These are pre-constructed searches designed to capture difficult concepts or concepts for which there are many search terms. They are usually database specific, but can be translated to work in different databases. Add them to a search strategy where appropriate and remember to cite them.

Example filters: 

Canada and Provinces - CINAHL database - University of Alberta

[mh Canada] or Canad* or "British Columbia" or "Colombie Britannique" or Alberta* or Saskatchewan or Manitoba* or Ontario or Quebec or "Nouveau Brunswick" or "New Brunswick" or "Nova Scotia" or "Nouvelle Ecosse" or "Prince Edward island" or Newfoundland or Labrador or Nunavut or NWT or "Northwest Territories" or Yukon or Nunavik or Inuvialuit

Qualitative Studies - Medline (OVID) - University of Alberta

exp qualitative research/ or grounded theory/ or exp nursing methodology research/ or (qualitative or ethnol$ or ethnog$ or ethnonurs$ or emic or etic or hermeneutic$ or phenomenolog$ or lived experience$ or (Grounded adj5 theor$) or content analys$ or thematic analys$ or narrative analys$ or metasynthes$ or meta-synthes$ or metasummar$ or meta-summar$ or metastud$ or meta-stud$ or meta-ethnog$ or metaethnog$ or meta-narrat$ or metanarrat$ or meta-interpret$ or metainterpret$ or (qualitative adj5 meta-analy$) or (qualitative adj5 metaanaly$) or action research or photovoice or photo voice).mp.

Review your searches

Once you have drafted your search and before executing it in all databases, it's a good idea to review it for completeness, typos, correct use of search tools, correct combination of search lines, etc.  A librarian can help with this or consult one of the following resources: 

Addressing Offensive Terms and Outdated Language

When creating thorough searches on certain topics, you might need to include some antiquated, non-standard, exclusionary, and possibly offensive terms to find older literature. The following article provides some guidance on how to address this: 

  • Townsend, Whitney; Anderson, Patricia; Capellari, Emily; Haines, Kate; Hansen, Sam; James, LaTeesa; MacEachern, Mark; Rana, Gurpreet; Saylor, Kate. Addressing antiquated, non-standard, exclusionary, and potentially offensive terms in evidence syntheses and systematic searches. 2022. https://dx.doi.org/10.7302/6408
Search guides
Video tutorials
Translating searches

Once you've completed a foundational search in one of the databases, it needs to be translated into the syntax of each of the other selected databases.
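The same concept usually has to be re-expressed in each platform's syntax. As a purely illustrative example (not taken from a published filter), a cancer concept might look roughly like this in two platforms:

Ovid MEDLINE: exp Neoplasms/ or (cancer* or tumor* or tumour*).ti,ab.

CINAHL (EBSCO): (MH "Neoplasms+") OR TI (cancer* OR tumor* OR tumour*) OR AB (cancer* OR tumor* OR tumour*)

Subject headings, field codes, truncation symbols and proximity operators all differ between platforms, so check each database's help documentation (or ask a librarian) when translating.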

Important: Results of each database search need to be recorded in your documentation for later reporting.

Citation Tracking 

Once you've completed the selection of articles for inclusion in your review, you may wish to add citation tracking (sometimes known as snowballing) to supplement your search. This entails checking those articles' reference lists and lists of citing articles. Most databases have a Cited By/Citing Articles function, but Google Scholar is probably the best place to find citing articles. Here are some resources that may help with this process:

Updating Search Results

Sometimes you may need to update database searches because a significant amount of time has passed between their initial execution and manuscript preparation, or because you're updating a previously published review.

 

"Grey literature is information produced outside of traditional publishing and distribution channels, and can include reports, policy literature, working papers, newsletters, government documents, speeches, white papers, urban plans, and so on.

This information is often produced by organizations "on the ground" (such as government and inter-governmental agencies, non-governmental organisations, and industry) to store information and report on activities, either for their own use or wider sharing and distribution, and without the delays and restrictions of commercial and academic publishing. For that reason, grey literature can be more current than literature in scholarly journals."1

Examples of grey literature include: 

  • Technical reports from government agencies or scientific research groups
  • White papers from organizations
  • Working papers from research institutions
  • Preprints of research articles not yet peer reviewed
  • Policy documents and briefs
  • Conference proceedings and presentations
  • Theses and dissertations from graduate students
  • Newsletters and bulletins from professional associations
  • Committee reports and meeting minutes
  • Unpublished studies or manuscripts
Grey literature guides:
Grey literature search templates

References

1. Simon Fraser University Library. (2023). Grey Literature: What it is & how to find it. https://www.lib.sfu.ca/help/research-assistance/format-type/grey-literature

Important: Results of each database search need to be recorded in your documentation for later reporting.

Results of each database search need to be exported and saved for upload into a citation manager or review tool, for removing duplicates, screening and data extraction.  Remember, screening doesn't happen in the databases. 

Use the RIS file format when exporting results as it's the most compatible with a wide array of tools.

Exporting results

How you export articles varies between databases. Sometimes you need to set up an account in the database's platform (e.g., Proquest, Ebsco, OVID) in order to export a large number of citations. Below is a selection of library guides with instructions on how to do this. Most databases will also have instructions in their help sections. 

Removing Duplicates (aka Deduplication) 

Why do I need to remove duplicates?

  1. Some databases index the same journals. This means that when the search strategy is translated and run in each database, you may retrieve multiple copies of the same reference.
  2. Multiple articles may be published from the same data set. This is harder to detect and may not apply to all types of syntheses. The Cochrane Handbook for Systematic Reviews discusses this.

If undetected, either could create bias in the conclusions of your review. Removing duplicates is an integral step in the process. 

Where removing duplicate articles is concerned (i.e., matching articles vs duplicate use of data sets) some systematic review tools (e.g., Covidence and Rayyan) can automatically remove duplicate articles, but be aware they aren't perfect and will miss some. Screening tip: sorting the articles by title will help you identify missed duplicates. You can also remove duplicates with a citation management tool (e.g., Mendeley, Zotero, Endnote), which can give you more control over de-duplication. 

Important: No matter which deduplication approach you choose, don't permanently delete duplicate records, and keep track of result totals before and after deduplication for later reporting. Some review tools will track this information.
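If you want to see which records are being flagged before anything is removed, a minimal sketch in Python along these lines groups records by a normalized title-plus-year key (the record list here is made up; a real workflow would read the records you exported from the databases):

import re
from collections import defaultdict

def dedup_key(record):
    """Normalize title and year so punctuation and case differences don't hide duplicates."""
    title = re.sub(r"[^a-z0-9 ]", "", record["title"].lower())
    title = re.sub(r"\s+", " ", title).strip()
    return (title, record.get("year", ""))

records = [
    {"title": "An Example Article Title", "year": "2023", "source": "MEDLINE"},
    {"title": "An example article title.", "year": "2023", "source": "CINAHL"},
    {"title": "A different study", "year": "2021", "source": "CINAHL"},
]

groups = defaultdict(list)
for rec in records:
    groups[dedup_key(rec)].append(rec)

unique = [grp[0] for grp in groups.values()]
flagged = [rec for grp in groups.values() for rec in grp[1:]]

print(f"{len(records)} records in, {len(unique)} unique, {len(flagged)} flagged as likely duplicates")

Matching on title and year alone is deliberately conservative; review anything flagged rather than deleting it automatically.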

Citation managers

Consider using a citation manager (e.g., Zotero, Mendeley, Endnote) to store and manage the records you find. It can assist you in organizing your records as you decide which ones to include in your review. Additionally, it is very effective for removing duplicate records. You can also link your own stored PDFs of your papers. Citation management software will also help you insert correctly formatted references into your final document and generate a reference list or bibliography.

 
Knowledge Synthesis Tools

These tools are specifically tailored to the needs of knowledge synthesis teams. In addition to reference management, some of these tools can also help with de-duplication, screening, data extraction, PRISMA flow diagram creation, analysis, tracking team progress, and facilitating communication between team members. Note that not every tool is appropriate for every kind of synthesis - be sure to choose the right fit for your project.

The following resources provide lists or comparison charts of core features for a variety of review tools: 

  • Tools for Knowledge Synthesis - University of Toronto
  • Systematic Review Software Comparison - University of Hawaii
  • Kohl, C., McIntosh, E. J., Unger, S., Haddaway, N. R., Kecke, S., Schiemann, J., & Wilhelm, R. (2018). Online tools supporting the conduct and reporting of systematic reviews and systematic maps: A case study on CADIMA and review of existing tools. Environmental Evidence, 7(1), 8. https://doi.org/10.1186/s13750-018-0115-5

  • Cowie, K., Rahmatullah, A., Hardy, N., Holub, K., & Kallmes, K. (2022). Web-Based Software Tools for Systematic Literature Review in Medicine: Systematic Search and Feature Analysis. JMIR Medical Informatics, 10(5), e33219. https://doi.org/10.2196/33219

Process

Screening involves a two-step process where you assess each article to determine if it meets the eligibility criteria outlined in your protocol, deciding whether it should be part of your review.

To minimize bias, it's recommended (and sometimes required by guidelines) to have at least two reviewers for screening—yourself and another member of your review team. In addition, you should ideally have a third person assigned the task of resolving conflicts. 

Preparing for Screening

Screening articles can be challenging, especially if you're new to this research methodology. It is recommended to pilot-test the screening process with reviewers to ensure the eligibility criteria are well defined and interpreted consistently. Reviewers should aim for a minimum of 90% agreement in their screening decisions.
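During pilot testing you may want to quantify that agreement. Below is a minimal sketch in Python (the function names and the two decision lists are hypothetical, not from this guide) that computes simple percent agreement and Cohen's kappa for two reviewers' include/exclude decisions on the same set of records:

from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of records on which two reviewers made the same decision."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return matches / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two reviewers (assumes agreement is not already perfect by chance)."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected agreement given each reviewer's observed include/exclude rates
    pe = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(r1) | set(r2))
    return (po - pe) / (1 - pe)

# Hypothetical pilot screening decisions for 10 records
reviewer_a = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "include"]
reviewer_b = ["include", "exclude", "include", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]

print(f"Percent agreement: {percent_agreement(reviewer_a, reviewer_b):.0%}")
print(f"Cohen's kappa: {cohens_kappa(reviewer_a, reviewer_b):.2f}")

Review tools such as Covidence report similar agreement figures automatically; the sketch just makes the calculation explicit.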

In their Screening for Articles guide, The University of Toronto provides excellent screening tips, as well as strategies to mitigate screening challenges. 

Step 1 Title/Abstract Screening

This step involves reviewing titles and abstracts to remove obviously irrelevant material. Typically you're not required to provide reason(s) for exclusion in this step.

Step 2 Full-text Screening 

At this stage, you review the full-text to ensure it meets your eligibility criteria. You must provide reason(s) for exclusion in this step. 

Finding the full-text:
  • Search the article title in Omni
  • Verify that it's not available for free by searching Google or Google Scholar
  • Order the article through the library's resource sharing services.  This can be done through Omni.
  • Contact the author to request a copy

When you can't find the full-text, exclude the article and be sure to indicate this reason in your screening tool.  When you set up your exclusion reasons in the tool, "Full-text unavailable" should be one of them. 

Tools for Screening 

See the list of Knowledge Synthesis Tools in the Manage Results section.

 

 

 

Stage 3 Data Extraction & Analysis

Quality Assessment, sometimes referred to as Critical Appraisal, is an integral part of some review methodologies. It involves the assessment of several related features, including but not limited to:  risk of bias, quality of reporting, reproducibility, precision and external validity.

Checklists
Additional Resources

After the second screening stage, researchers need to read the full articles chosen for the review and pull out the important data using a form or template that has been designed to capture all of the detail relevant to the synthesis.

This form can be as detailed or as simple as needed, and can be coded for computer analysis.
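For illustration, a simple extraction form (the fields below are generic suggestions, not a required template) might include:

  • Study ID and full citation
  • Country and setting
  • Study design
  • Population and sample size
  • Intervention or exposure (and comparator, if any)
  • Outcomes and how they were measured
  • Key findings
  • Funding source and declared conflicts of interest
  • Notes relevant to quality/risk-of-bias assessment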

Piloting Data Extraction
  • Train the review team to ensure members understand the data extraction form and to clarify the type of data expected for each category.
  • Establish standards
  • Pilot the extraction form using 5 to 10 randomly selected studies. This ensures data extractors record similar data.
  • Make necessary adjustments to the extraction form.
  • Document any changes to the process or form. Keep track of decisions made and the reasoning behind them.

Guides and Resources

 

Methods of analyzing the extracted data vary between knowledge syntheses. Below are some resources that may help you with this process.

Tools for Analysis

Stage 4 Report

Consult these guides to find information about reporting standards and tips on writing up your knowledge synthesis:

Timelines

How long does it take to complete a knowledge synthesis? The answer: it depends. Generally speaking, knowledge synthesis research is time intensive, and projects often span a year or more to completion. Factors that affect the length of time it takes to do a knowledge synthesis:

  • Topic
  • Availability of studies
  • Review methodology
  • Size of research team and whether they're working on the review full-time or part-time
  • Experience and required training
  • Levels of support available to the research team

The tables below provide some average timelines for different types of syntheses as well the stages in a knowledge synthesis project.  

Try PredicTER, a tool that calculates the time requirements for various tasks involved in reviewing evidence, from planning and coordination to quantitative synthesis and reporting.

Time to complete by type of synthesis

Literature review
  • Approximate number of months: 1-6
  • Why: Bulk of time is spent reading resources and critically analyzing. Highly adaptable to available resources and timelines.
  • Team: Typically one person.

Mapping review
  • Approximate number of months: 6-12
  • Why: Time is divided between searching and visually depicting the field. May require more time screening articles due to the larger volume of studies that comes with covering a wider scope.
  • Team: Typically one person. Needs expertise in visualization tools.

Meta-analysis
  • Approximate number of months: 18-24
  • Why: Time is spent on all aspects of a systematic review, as well as typically two stages of data analysis.
  • Team: Requires experience in advanced statistical methods.

Rapid review
  • Approximate number of months: 1-8
  • Why: Time is divided between designing the question and search strategy, reviewing and evaluating sources, and summarizing. Justified adjustments to the methodology are made to meet resource limitations, especially time restrictions.
  • Team: Typically requires multiple people to reduce bias, depending on the type of review.

Scoping review
  • Approximate number of months: 2-24
  • Why: Time is spent formulating the question, developing a protocol, refining the search, screening studies, summarizing and synthesizing relevant studies, reporting results, and identifying gaps and/or trends.
  • Team: Typically requires a minimum of two people. May require larger teams because there are more search results to screen. May take longer than a systematic review.

Systematic review
  • Approximate number of months: 12-24
  • Why: Time is spent defining the research question, developing a protocol (inclusion/exclusion criteria), testing, refining, and translating search strategies, title and abstract screening, full-text screening, risk of bias assessment, and data extraction and analysis.
  • Team: Minimum of two people to reduce bias.

Table adapted from Evidence Synthesis, Library, University of Winnipeg.

Timeline for stages of a knowledge synthesis project

  • Months 1-2 (Preparation): Formulate the research question. Refine and finalize your research question, ideally using a research question framework.
  • Months 1-2 (Preparation): Check for existing evidence synthesis reviews on your topic. Search for published evidence synthesis reviews or protocols that answer questions similar to yours.
  • Months 1-3 (Preparation): Develop the foundational search strategy. Gather keywords and subject headings and incorporate appropriate syntax to create a search strategy that will find all relevant studies. You will need to translate your foundational search strategy for each database important to your topic.
  • Months 2-3 (Preparation): Write and register a protocol. Document the method, timeline and responsibilities for your review. Register the protocol to create an immutable record of your planned project and enhance accountability.
  • Months 3-5 (Collecting Data): Primary search. Execute your search strategy in each database.
  • Months 3-5 (Collecting Data): Grey literature search. If aligned with your inclusion criteria, search for grey literature such as conference proceedings, reports, dissertations and theses.
  • Months 4-5 (Collecting Data): Export and de-duplicate citations. Export citations into your citation manager and remove duplicate citations.
  • Months 5-8 (Selecting & Synthesizing): Title and abstract screening. Screen titles and abstracts against your inclusion and exclusion criteria.
  • Months 5-8 (Selecting & Synthesizing): Obtain full-text articles. Download from databases, request copies through inter-library loan, or contact authors.
  • Months 5-8 (Selecting & Synthesizing): Full-text screening. Exclude irrelevant articles based on your inclusion/exclusion criteria.
  • Month 9 (Collecting Data/Selecting & Synthesizing): Supplemental search. Check reference lists of included articles for potential additional citations; new titles must go through the screening process.
  • Months 9-11 (Selecting & Synthesizing): Data extraction. Extract all relevant data from each article and record it for analysis.
  • Months 9-11 (Selecting & Synthesizing): Data synthesis. If appropriate, convert extracted data to common measurements.
  • Months 9-11 (Selecting & Synthesizing): Data analysis. Combine results from all included studies; analysis may be qualitative (thematic) or quantitative.
  • Month 11 (Collecting Data/Selecting & Synthesizing): Re-run the foundational search. Repeat the search to find new literature published since the initial search; new articles must go through the screening process.
  • Month 12 (Report): Write the report of the review. Produce and disseminate the final report of results.

Table adapted from Knowledge Synthesis, Western Libraries.

Methodology resources by type

Websites
Articles & Books
Articles & Books

 

Articles
  • James, K. L., Randall, N. P., & Haddaway, N. R. (2016). A methodology for systematic mapping in environmental sciences. Environmental Evidence, 5(1), 7. https://doi.org/10.1186/s13750-016-0059-6
  • Miake-Lye, I. M., Hempel, S., Shanman, R., & Shekelle, P. G. (2016). What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products. Systematic Reviews, 5(1), 28. https://doi.org/10.1186/s13643-016-0204-x
  • Snilstveit, B., Vojtkova, M., Bhavsar, A., & Gaarder, M. (2013). Evidence Gap Maps—A Tool for Promoting Evidence-Informed Policy and Prioritizing Future Research (SSRN Scholarly Paper No. 2367606). Social Science Research Network. https://papers.ssrn.com/abstract=2367606

 

Articles & Books

 

Websites
Articles & Books
  • Hong, Q. N., Pluye, P., Bujold, M., & Wassef, M. (2017). Convergent and sequential synthesis designs: Implications for conducting and reporting systematic reviews of qualitative and quantitative evidence. Systematic Reviews, 6(1), 61. https://doi.org/10.1186/s13643-017-0454-2
  • Lizarondo, L., Stern, C., Carrier, J., Godfrey, C., Rieger, K., Salmond, S., Apostolo, J., Kirkpatrick, P., & Loveday, H. (2024). Mixed methods systematic reviews. In E. Aromataris, C. Lockwood, K. Porritt, B. Pilla, & Z. Jordan (Eds.), JBI Manual for Evidence Synthesis. JBI. https://doi.org/10.46658/JBIMES-24-07
  • Pearson, A., White, H., Bath-Hextall, F., Salmond, S., Apostolo, J., & Kirkpatrick, P. (2015). A mixed-methods approach to systematic reviews. International Journal of Evidence-Based Healthcare, 13(3), 121–131. https://doi.org/10.1097/XEB.0000000000000052
  • Petticrew, M., Rehfuess, E., Noyes, J., Higgins, J. P. T., Mayhew, A., Pantoja, T., Shemilt, I., & Sowden, A. (2013). Synthesizing evidence on complex interventions: How meta-analytical, qualitative, and mixed-method approaches can contribute. Journal of Clinical Epidemiology, 66(11), 1230–1243. https://doi.org/10.1016/j.jclinepi.2013.06.005
  • Pluye, P., & Hong, Q. N. (2014). Combining the Power of Stories and the Power of Numbers: Mixed Methods Research and Mixed Studies Reviews. Annual Review of Public Health, 35(1), 29–45. https://doi.org/10.1146/annurev-publhealth-032013-182440
  • Pluye, P., Hong, Q. N., Bush, P. L., & Vedel, I. (2016). Opening-up the definition of systematic literature review: The plurality of worldviews, methodologies and methods for reviews and syntheses. Journal of Clinical Epidemiology, 73, 2–5. https://doi.org/10.1016/j.jclinepi.2015.08.033
  • Sandelowski, M., Leeman, J., Knafl, K., & Crandell, J. L. (2013). Text-in-Context: A Method for Extracting Findings in Mixed-Methods Mixed Research Synthesis Studies. Journal of Advanced Nursing, 69(6), 1428–1437. https://doi.org/10.1111/jan.12000
  • Stern, C., Lizarondo, L., Carrier, J., Godfrey, C., Rieger, K., Salmond, S., Apóstolo, J., Kirkpatrick, P., & Loveday, H. (2020). Methodological guidance for the conduct of mixed methods systematic reviews. JBI Evidence Synthesis, 18(10), 2108. https://doi.org/10.11124/JBISRIR-D-19-00169
  • Stern, C., Lizarondo, L., Carrier, J., Godfrey, C., Rieger, K., Salmond, S., Apostolo, J., Kirkpatrick, P., & Loveday, H. (2021). Methodological guidance for the conduct of mixed methods systematic reviews. JBI Evidence Implementation, 19(2), 120–129. https://doi.org/10.1097/XEB.0000000000000282

 

 

 

Articles & Books
  • Dobbins, M. (2017). Rapid review guidebook. National Collaborating Centre for Methods and Tools, School of Nursing, McMaster University.
  • Garritty, C., Gartlehner, G., Nussbaumer-Streit, B., King, V. J., Hamel, C., Kamel, C., Affengruber, L., & Stevens, A. (2021). Cochrane Rapid Reviews Methods Group offers evidence-informed guidance to conduct rapid reviews. Journal of Clinical Epidemiology, 130, 13–22. https://doi.org/10.1016/j.jclinepi.2020.10.007
  • Garritty, C., Hamel, C., Trivella, M., Gartlehner, G., Nussbaumer-Streit, B., Devane, D., Kamel, C., Griebler, U., & King, V. J. (2024). Updated recommendations for the Cochrane rapid review methods guidance for rapid reviews of effectiveness. BMJ, 384, e076335. https://doi.org/10.1136/bmj-2023-076335
  • Guo, Q., Jiang, G., Zhao, Q., Long, Y., Feng, K., Gu, X., Xu, Y., Li, Z., Huang, J., & Du, L. (n.d.). Rapid review: A review of methods and recommendations based on current evidence. Journal of Evidence-Based Medicine, n/a(n/a). https://doi.org/10.1111/jebm.12594
  • Haby, M. M., Barreto, J. O. M., Kim, J. Y. H., Peiris, S., Mansilla, C., Torres, M., Guerrero-Magaña, D. E., & Reveiz, L. (2024). What are the best methods for rapid reviews of the research evidence? A systematic review of reviews and primary studies. Research Synthesis Methods, 15(1), 2–20. https://doi.org/10.1002/jrsm.1664
  • Hamel, C., Michaud, A., Thuku, M., Skidmore, B., Stevens, A., Nussbaumer-Streit, B., & Garritty, C. (2021). Defining Rapid Reviews: A systematic scoping review and thematic analysis of definitions and defining characteristics of rapid reviews. Journal of Clinical Epidemiology, 129, 74–85. https://doi.org/10.1016/j.jclinepi.2020.09.041
  • Smela, B., Toumi, M., Świerk, K., Francois, C., Biernikiewicz, M., Clay, E., & Boyer, L. (n.d.). Rapid literature review: Definition and methodology. Journal of Market Access & Health Policy, 11(1), 2241234. https://doi.org/10.1080/20016689.2023.2241234
  • The production of quick scoping reviews and rapid evidence assessments. (2016). GOV.UK.
  • Tricco, A. C., Khalil, H., Holly, C., Feyissa, G., Godfrey, C., Evans, C., Sawchuck, D., Sudhakar, M., Asahngwa, C., Stannard, D., Abdulahi, M., Bonnano, L., Aromataris, E., McInerney, P., Wilson, R., Pang, D., Wang, Z., Cardoso, A. F., Peters, M. D. J., … Munn, Z. (2022). Rapid reviews and the methodological rigor of evidence synthesis: A JBI position statement. JBI Evidence Synthesis, 20(4), 944. https://doi.org/10.11124/JBIES-21-00371
  • Wollscheid, S., & Tripney, J. (2021). Rapid reviews as an emerging approach to evidence synthesis in education. London Review of Education, 19(1), Article 1. https://doi.org/10.14324/LRE.19.1.32

Rapid review method series - BMJ Evidence-Based Medicine

  • Affengruber, L., Nussbaumer-Streit, B., Hamel, C., Maten, M. V. der, Thomas, J., Mavergames, C., Spijker, R., & Gartlehner, G. (2024). Rapid review methods series: Guidance on the use of supportive software. BMJ Evidence-Based Medicine, 29(4), 264–271. https://doi.org/10.1136/bmjebm-2023-112530
  • Booth, A., Sommer, I., Noyes, J., Houghton, C., & Campbell, F. (2024). Rapid reviews methods series: Guidance on rapid qualitative evidence synthesis. BMJ Evidence-Based Medicine. https://doi.org/10.1136/bmjebm-2023-112620
  • Campbell, F., Sutton, A., Pollock, D., Garritty, C., Tricco, A. C., Schmidt, L., & Khalil, H. (2025). Rapid reviews methods series (paper 7): Guidance on rapid scoping, mapping and evidence and gap map (‘Big Picture Reviews’). BMJ Evidence-Based Medicine. https://doi.org/10.1136/bmjebm-2023-112389
  • Garritty, C., Nussbaumer-Streit, B., Hamel, C., & Devane, D. (2024). Rapid reviews methods series: Assessing the appropriateness of conducting a rapid review. BMJ Evidence-Based Medicine. https://doi.org/10.1136/bmjebm-2023-112722
  • Garritty, C., Tricco, A. C., Smith, M., Pollock, D., Kamel, C., & King, V. J. (2024). Rapid Reviews Methods Series: Involving patient and public partners, healthcare providers and policymakers as knowledge users. BMJ Evidence-Based Medicine, 29(1), 55–61. https://doi.org/10.1136/bmjebm-2022-112070
  • Gartlehner, G., Nussbaumer-Streit, B., Devane, D., Kahwati, L., Viswanathan, M., King, V. J., Qaseem, A., Akl, E., & Schuenemann, H. J. (2024). Rapid reviews methods series: Guidance on assessing the certainty of evidence. BMJ Evidence-Based Medicine, 29(1), 50–54. https://doi.org/10.1136/bmjebm-2022-112111
  • King, V. J., Nussbaumer-Streit, B., Shaw, E., Devane, D., Kahwati, L., Viswanathan, M., & Gartlehner, G. (2024). Rapid reviews methods series: Considerations and recommendations for evidence synthesis in rapid reviews. BMJ Evidence-Based Medicine, 29(6), 419–422. https://doi.org/10.1136/bmjebm-2023-112617
  • Klerings, I., Robalino, S., Booth, A., Escobar-Liquitay, C. M., Sommer, I., Gartlehner, G., Devane, D., & Waffenschmidt, S. (2023). Rapid reviews methods series: Guidance on literature search. BMJ Evidence-Based Medicine, 28(6), 412–417. https://doi.org/10.1136/bmjebm-2022-112079
  • Nussbaumer-Streit, B., Sommer, I., Hamel, C., Devane, D., Noel-Storr, A., Puljak, L., Trivella, M., & Gartlehner, G. (2023). Rapid reviews methods series: Guidance on team considerations, study selection, data extraction and risk of bias assessment. BMJ Evidence-Based Medicine, 28(6), 418–423. https://doi.org/10.1136/bmjebm-2022-112185
  • Stevens, A., Hersi, M., Garritty, C., Hartling, L., Shea, B. J., Stewart, L. A., Welch, V. A., & Tricco, A. C. (2024). Rapid review method series: Interim guidance for the reporting of rapid reviews. BMJ Evidence-Based Medicine. https://doi.org/10.1136/bmjebm-2024-112899

 

 

Articles & Books
  • Booth, A., Briscoe, S., & Wright, J. M. (2020). The “realist search”: A systematic scoping review of current practice and reporting. Research Synthesis Methods, 11(1), 14–35. https://doi.org/10.1002/jrsm.1386
  • Emmel, N., Greenhalgh, J., Manzano, A., Monaghan, M., & Dalkin, S. (2018). Doing realist research (1st ed.). SAGE Publications Ltd.
  • Grootel, L. van, Wesel, F. van, O’Mara‐Eves, A., Thomas, J., Hox, J., & Boeije, H. (2017). Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach. Research Synthesis Methods, 8(3), 303–311. https://doi.org/10.1002/jrsm.1241
  • Pawson, R. (2006). Evidence-based policy: A realist perspective. SAGE.
  • Pawson, R., Greenhalgh, T., Harvey, G., & Walshe, K. (2005). Realist review—A new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy, 10 Suppl 1, 21–34. https://doi.org/10.1258/1355819054308530
  • Rycroft-Malone, J., McCormack, B., Hutchinson, A. M., DeCorby, K., Bucknall, T. K., Kent, B., Schultz, A., Snelgrove-Clarke, E., Stetler, C. B., Titler, M., Wallin, L., & Wilson, V. (2012). Realist synthesis: Illustrating the method for implementation research. Implementation Science, 7(1), 33. https://doi.org/10.1186/1748-5908-7-33
  • Saul, J. E., Willis, C. D., Bitz, J., & Best, A. (2013). A time-responsive tool for informing policy making: Rapid realist review. Implementation Science, 8(1), 103. https://doi.org/10.1186/1748-5908-8-103
  • Wong, G., Greenhalgh, T., Westhorp, G., Buckingham, J., & Pawson, R. (2013). RAMESES publication standards: Realist syntheses. BMC Medicine, 11, 21. https://doi.org/10.1186/1741-7015-11-21
  • Wong, G., Westhorp, G., Manzano, A., Greenhalgh, J., Jagosh, J., & Greenhalgh, T. (2016). RAMESES II reporting standards for realist evaluations. BMC Medicine, 14(1), 96. https://doi.org/10.1186/s12916-016-0643-1

 

Standard
Articles & Books
  • Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32. https://doi.org/10.1080/1364557032000119616
  • Khalil, H., Peters, M. DJ., Tricco, A. C., Pollock, D., Alexander, L., McInerney, P., Godfrey, C. M., & Munn, Z. (2021). Conducting high quality scoping reviews-challenges and solutions. Journal of Clinical Epidemiology, 130, 156–160. https://doi.org/10.1016/j.jclinepi.2020.10.009
  • Levac, D., Colquhoun, H., & O’Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science: IS, 5, 69. https://doi.org/10.1186/1748-5908-5-69
  • Munn, Z., Pollock, D., Khalil, H., Alexander, L., Mclnerney, P., Godfrey, C. M., Peters, M., & Tricco, A. C. (2022). What are scoping reviews? Providing a formal definition of scoping reviews as a type of evidence synthesis. JBI Evidence Synthesis, 20(4), 950. https://doi.org/10.11124/JBIES-21-00483
  • Peters, M. D. J., Godfrey, C. M., Khalil, H., McInerney, P., Parker, D., & Soares, C. B. (2015). Guidance for conducting systematic scoping reviews. JBI Evidence Implementation, 13(3), 141. https://doi.org/10.1097/XEB.0000000000000050
  • Peters, M. D. J., Marnie, C., Tricco, A. C., Pollock, D., Munn, Z., Alexander, L., McInerney, P., Godfrey, C. M., & Khalil, H. (2020). Updated methodological guidance for the conduct of scoping reviews. JBI Evidence Synthesis, 18(10), 2119. https://doi.org/10.11124/JBIES-20-00167
  • Pham, M. T., Rajić, A., Greig, J. D., Sargeant, J. M., Papadopoulos, A., & McEwen, S. A. (2014). A scoping review of scoping reviews: Advancing the approach and enhancing the consistency. Research Synthesis Methods, 5(4), 371–385. https://doi.org/10.1002/jrsm.1123
  • Phillips-Beck, W., Bukich, B. L. J., Thiessen, K., Lavoie, J. G., Schultz, A., Sanguins, J., Beck, G., Longclaws, B., Shingoose, G., Palmer, M., Linton, J., Negash, B., & Morriseau, T. (2024). An Indigenous-informed scoping review study methodology: Advancing the science of scoping reviews. Systematic Reviews, 13(1), 181. https://doi.org/10.1186/s13643-024-02586-1
  • Pollock, D., Davies, E. L., Peters, M. D. J., Tricco, A. C., Alexander, L., McInerney, P., Godfrey, C. M., Khalil, H., & Munn, Z. (2021). Undertaking a scoping review: A practical guide for nursing and midwifery students, clinicians, researchers, and academics. Journal of Advanced Nursing, 77(4), 2102–2113. https://doi.org/10.1111/jan.14743
  • Pollock, D., Peters, M. D. J., Khalil, H., McInerney, P., Alexander, L., Tricco, A. C., Evans, C., de Moraes, É. B., Godfrey, C. M., Pieper, D., Saran, A., Stern, C., & Munn, Z. (2023). Recommendations for the extraction, analysis, and presentation of results in scoping reviews. JBI Evidence Synthesis, 21(3), 520. https://doi.org/10.11124/JBIES-22-00123
  • Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K., Colquhoun, H., Kastner, M., Levac, D., Ng, C., Sharpe, J. P., Wilson, K., Kenny, M., Warren, R., Wilson, C., Stelfox, H. T., & Straus, S. E. (2016). A scoping review on the conduct and reporting of scoping reviews. BMC Medical Research Methodology, 16(1), 15. https://doi.org/10.1186/s12874-016-0116-4

 

 

 

Articles & Books
  • Barry, E. S., Merkebu, J., & Varpio, L. (2022a). How to Conduct a State-of-the-Art Literature Review. Journal of Graduate Medical Education, 14(6), 663–665. https://doi.org/10.4300/JGME-D-22-00704.1
  • Barry, E. S., Merkebu, J., & Varpio, L. (2022b). State-of-the-art literature review methodology: A six-step approach for knowledge synthesis. Perspectives on Medical Education, 11(5), 281–288. https://doi.org/10.1007/s40037-022-00725-9
  • Barry, E. S., Merkebu, J., & Varpio, L. (2022c). Understanding State-of-the-Art Literature Reviews. Journal of Graduate Medical Education, 14(6), 659–662. https://doi.org/10.4300/JGME-D-22-00705.1

 

Articles & Books
Health Sciences
  • Aromataris, E., & Pearson, A. (2014). The Systematic Review: An Overview. AJN The American Journal of Nursing, 114(3), 53–58. https://doi.org/10.1097/01.NAJ.0000444496.24228.2c
  • Bowden, V. R. (2021). Types of Reviews -- Part 1: Systematic Reviews. Pediatric Nursing, 47(6), 301–304.
  • Butler, A., Hall, H., & Copnell, B. (2016). A Guide to Writing a Qualitative Systematic Review Protocol to Enhance Evidence-Based Practice in Nursing and Health Care. Worldviews on Evidence-Based Nursing, 13(3), 241–249. https://doi.org/10.1111/wvn.12134
  • Clarke, J. (2011). What is a systematic review? Evidence-Based Nursing, 14(3), 64–64. https://doi.org/10.1136/ebn.2011.0049
  • Gomersall, J. S., Jadotte, Y. T., Xue, Y., Lockwood, S., Riddle, D., & Preda, A. (2015). Conducting systematic reviews of economic evaluations. JBI Evidence Implementation, 13(3), 170. https://doi.org/10.1097/XEB.0000000000000063
  • Greve, B. (2017). Handbook of Social Policy Evaluation. Edward Elgar Publishing.
  • Haddaway, N. R., Woodcock, P., Macura, B., & Collins, A. (2015). Making literature reviews more reliable through application of lessons from systematic reviews. Conservation Biology, 29(6), 1596–1605. https://doi.org/10.1111/cobi.12541
  • Higgins, J. P. T., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M. J., & Welch, V. A. (Eds.). (2019). Cochrane handbook for systematic reviews of interventions (2nd ed.). John Wiley & Sons.
  • Lavallée, M., Robillard, P.-N., & Mirsalari, R. (2014). Performing Systematic Literature Reviews with Novices: An Iterative Approach. IEEE Transactions on Education, 57(3), 175–181.
  • Muka, T., Glisic, M., Milic, J., Verhoog, S., Bohlius, J., Bramer, W., Chowdhury, R., & Franco, O. H. (2020). A 24-step guide on how to design, conduct, and successfully publish a systematic review and meta-analysis in medical research. European Journal of Epidemiology, 35(1), 49–60. https://doi.org/10.1007/s10654-019-00576-5
  • Munn, Z., Moola, S., Lisy, K., Riitano, D., & Tufanaru, C. (2015). Methodological guidance for systematic reviews of observational epidemiological studies reporting prevalence and cumulative incidence data. JBI Evidence Implementation, 13(3), 147. https://doi.org/10.1097/XEB.0000000000000054
  • Munn, Z., Stern, C., Aromataris, E., Lockwood, C., & Jordan, Z. (2018). What kind of systematic review should I conduct? A proposed typology and guidance for systematic reviewers in the medical and health sciences. BMC Medical Research Methodology, 18(1), 5. https://doi.org/10.1186/s12874-017-0468-4
  • Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews, 10(1), 89. https://doi.org/10.1186/s13643-021-01626-4
  • Holly, C., Salmond, S., & Saimbert, M. (2016). Comprehensive Systematic Review for Advanced Practice Nursing (2nd ed.). Springer Publishing Company.

Social Sciences
  • Andrews, R. (2005). The Place of Systematic Reviews in Education Research. British Journal of Educational Studies, 53(4), 399–416.
  • Badger, D., Nursten, J., Williams, P., & Woodward, M. (2000). Should All Literature Reviews Be Systematic? Evaluation and Research in Education, 14, 220–230.
  • Bearman, M., Smith, C. D., Carbone, A., Slade, S., Baik, C., Hughes-Warrington, M., & Neumann, D. L. (2012). Systematic Review Methodology in Higher Education. Higher Education Research and Development, 31(5), 625–640. https://doi.org/10.1080/07294360.2012.702735
  • Beelmann, A. (2014). Potentials and limits of the systematic accumulation of evidence via systematic research synthesis within educational research. Zeitschrift für Erziehungswissenschaft, 17, 55–78. https://doi.org/10.1007/s11618-014-0509-2
  • Campbell, A., Taylor, B., Bates, J., & O’Connor-Bones, U. (2018). Developing and Applying a Protocol for a Systematic Review in the Social Sciences. New Review of Academic Librarianship, 24(1), 1–22. https://doi.org/10.1080/13614533.2017.1281827
  • Chapman, K. (2021). Characteristics of systematic reviews in the social sciences. The Journal of Academic Librarianship, 47(5), 102396. https://doi.org/10.1016/j.acalib.2021.102396
  • Siddaway, A. P., Wood, A. M., & Hedges, L. V. (2019). How to Do a Systematic Review: A Best Practice Guide for Conducting and Reporting Narrative Reviews, Meta-Analyses, and Meta-Syntheses. Annual Review of Psychology, 70, 747–770. https://doi.org/10.1146/annurev-psych-010418-102803
  • Slebodnik, M., Pardon, K., & Hermer, J. (2022). Who’s Publishing Systematic Reviews? An Examination Beyond the Health Sciences. Issues in Science and Technology Librarianship, 101, Article 101. https://doi.org/10.29173/istl2671
  • Xiao, Y., & Watson, M. (2019). Guidance on Conducting a Systematic Literature Review. Journal of Planning Education and Research, 39(1), 93–112. https://doi.org/10.1177/0739456X17723971
  • Zawacki-Richter, O., Kerres, M., Bedenlier, S., Bond, M., & Buntins, K. (2019). Systematic Reviews in Educational Research. Springer Nature.

Methodology resources by discipline (in progress)

Articles
  • Bergeron, D. A., Tremblay, M.-C., Dogba, M. J., Martin, D., & McGavock, J. (2021). The use of realist approaches for health research in Indigenous communities. AlterNative: An International Journal of Indigenous Peoples, 17(1), 106–110. https://doi.org/10.1177/1177180121996063
  • Gribble, M. O., & Around Him, D. M. (2014). Ethics and Community Involvement in Syntheses Concerning American Indian, Alaska Native, or Native Hawaiian Health: A Systematic Review. AJOB Empirical Bioethics. https://www.tandfonline.com/doi/abs/10.1080/21507716.2013.848956
  • Harding, L., Marra, C. J., & Illes, J. (2021). Establishing a comprehensive search strategy for Indigenous health literature reviews. Systematic Reviews, 10(1), 115. https://doi.org/10.1186/s13643-021-01664-y
  • McDonald, E., Priest, N., Doyle, J., Bailie, R., Anderson, I., & Waters, E. (2010). Issues and challenges for systematic reviews in indigenous health. Journal of Epidemiology & Community Health, 64(7), 643–644. https://doi.org/10.1136/jech.2008.077503
  • Phillips-Beck, W., Bukich, B. L. J., Thiessen, K., Lavoie, J. G., Schultz, A., Sanguins, J., Beck, G., Longclaws, B., Shingoose, G., Palmer, M., Linton, J., Negash, B., & Morriseau, T. (2024). An Indigenous-informed scoping review study methodology: Advancing the science of scoping reviews. Systematic Reviews, 13(1), 181. https://doi.org/10.1186/s13643-024-02586-1

Articles & Books
  • Bayliss, H. R., Haddaway, N. R., Eales, J., Frampton, G. K., & James, K. L. (2016). Updating and amending systematic reviews and systematic maps in environmental management. Environmental Evidence, 5(1), 20. https://doi.org/10.1186/s13750-016-0073-8
  • Doerr, E. D., Dorrough, J., Davies, M. J., Doerr, V. A. J., & McIntyre, S. (2015). Maximizing the value of systematic reviews in ecology when data or resources are limited. Austral Ecology, 40(1), 1–11. https://doi.org/10.1111/aec.12179
  • Foo, Y. Z., O’Dea, R. E., Koricheva, J., Nakagawa, S., & Lagisz, M. (2021). A practical guide to question formation, systematic searching and study screening for literature reviews in ecology and evolution. Methods in Ecology and Evolution, 12(9), 1705–1720. https://doi.org/10.1111/2041-210X.13654
  • Gurevitch, J., Curtis, P. S., & Jones, M. H. (2001). Meta-analysis in ecology. In Advances in Ecological Research (Vol. 32, pp. 199–247). Academic Press. https://doi.org/10.1016/S0065-2504(01)32013-5
  • Haddaway, N. R., Bethel, A., Dicks, L. V., Koricheva, J., Macura, B., Petrokofsky, G., Pullin, A. S., Savilaakso, S., & Stewart, G. B. (2020). Eight problems with literature reviews and how to fix them. Nature Ecology & Evolution, 4(12), 1582–1589. https://doi.org/10.1038/s41559-020-01295-x
  • Haddaway, N. R., Kohl, C., Rebelo da Silva, N., Schiemann, J., Spök, A., Stewart, R., Sweet, J. B., & Wilhelm, R. (2017). A framework for stakeholder engagement during systematic reviews and maps in environmental management. Environmental Evidence, 6(1), 11. https://doi.org/10.1186/s13750-017-0089-8
  • Haddaway, N. R., Macura, B., Whaley, P., & Pullin, A. S. (2018). ROSES RepOrting standards for Systematic Evidence Syntheses: Pro forma, flow-diagram and descriptive summary of the plan and conduct of environmental systematic reviews and systematic maps. Environmental Evidence, 7(1), 7. https://doi.org/10.1186/s13750-018-0121-7
  • Haddaway, N. R., & Westgate, M. J. (2019). Predicting the time needed for environmental systematic reviews and systematic maps. Conservation Biology, 33(2), 434–443. https://doi.org/10.1111/cobi.13231
  • Haddaway, N. R., Woodcock, P., Macura, B., & Collins, A. (2015). Making literature reviews more reliable through application of lessons from systematic reviews. Conservation Biology, 29(6), 1596–1605. https://doi.org/10.1111/cobi.12541
  • Kohl, C., McIntosh, E. J., Unger, S., Haddaway, N. R., Kecke, S., Schiemann, J., & Wilhelm, R. (2018). Online tools supporting the conduct and reporting of systematic reviews and systematic maps: A case study on CADIMA and review of existing tools. Environmental Evidence, 7(1), 8. https://doi.org/10.1186/s13750-018-0115-5
  • Koricheva, J., & Gurevitch, J. (2014). Uses and misuses of meta-analysis in plant ecology. Journal of Ecology, 102(4), 828–844. https://doi.org/10.1111/1365-2745.12224
  • Koricheva, J., Gurevitch, J., & Mengersen, K. (2013). Handbook of Meta-analysis in Ecology and Evolution. Princeton University Press. 
  • Livoreil, B., Glanville, J., Haddaway, N. R., Bayliss, H., Bethel, A., de Lachapelle, F. F., Robalino, S., Savilaakso, S., Zhou, W., Petrokofsky, G., & Frampton, G. (2017). Systematic searching for environmental evidence using multiple tools and sources. Environmental Evidence, 6(1), 23. https://doi.org/10.1186/s13750-017-0099-6
  • McDougall, R. (2015). Reviewing literature in bioethics research: Increasing rigour in non-systematic reviews. Bioethics, 29(7), 523–528. https://doi.org/10.1111/bioe.12149 
  • McKinnon, M. C., Cheng, S. H., Garside, R., Masuda, Y. J., & Miller, D. C. (2015). Sustainability: Map the evidence. Nature, 528(7581), 185–187. https://doi.org/10.1038/528185a
  • Nakagawa, S., & Santos, E. S. A. (2012). Methodological issues and advances in biological meta-analysis. Evolutionary Ecology, 26(5), 1253–1274. https://doi.org/10.1007/s10682-012-9555-5
  • Nakagawa, S., Yang, Y., Macartney, E. L., Spake, R., & Lagisz, M. (2023). Quantitative evidence synthesis: A practical guide on meta-analysis, meta-regression, and publication bias tests for environmental sciences. Environmental Evidence, 12(1), 8. https://doi.org/10.1186/s13750-023-00301-6
  • O’Dea, R. E., Lagisz, M., Jennions, M. D., Koricheva, J., Noble, D. W. A., Parker, T. H., Gurevitch, J., Page, M. J., Stewart, G., Moher, D., & Nakagawa, S. (2021). Preferred reporting items for systematic reviews and meta-analyses in ecology and evolutionary biology: A PRISMA extension. Biological Reviews, 96(5), 1695–1722. https://doi.org/10.1111/brv.12721
  • Pullin, A. S., Frampton, G. K., Livoreil, B., & Petrokofsky, G. (2022). Guidelines and Standards for Evidence Synthesis in Environmental Management Version 5.1. Collaboration for Environmental Evidence. https://environmentalevidence.org/information-for-authors/guidelines-for-authors/