
Methods for Development

Date: April 28, 2011

The following sections summarize the methods for development. Additional detail on these methods is available in the sections below:

Searching for Reviews

Detailed electronic searches and handsearches were performed to identify potentially relevant systematic reviews. Selection criteria used to identify key systematic reviews included the methodological quality and currency of the research.

Methods for Data Collection and Quality Assessment of Systematic Reviews

A standardized data extraction form was developed and refined through consultation and piloting. The form summarized evidence, methodological quality, and key characteristics for each review.

AMSTAR (A MeaSurement Tool to Assess systematic Reviews) was used to assess systematic review quality. This tool provides an overall quality rating on a scale of 0 to 11, where 11 represents a review of the highest quality. Quality categories were defined as follows: low (score 0 to 3), medium (score 4 to 7), and high (score 8 to 11).
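As an illustration, the mapping from a 0–11 AMSTAR score to the quality categories above can be sketched as a simple threshold function (the function name and structure are our own, not part of the AMSTAR tool itself):

```python
def amstar_category(score: int) -> str:
    """Map a 0-11 AMSTAR score to the quality category used here.

    Categories (as defined in the text): low (0-3), medium (4-7),
    high (8-11). Illustrative sketch only.
    """
    if not 0 <= score <= 11:
        raise ValueError("AMSTAR scores range from 0 to 11")
    if score <= 3:
        return "low"
    if score <= 7:
        return "medium"
    return "high"
```

For example, a review scoring 8 on AMSTAR would be categorized as high quality.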

Data collection for interventions targeting Health Care Professionals was performed by two reviewers. Discrepancies were adjudicated by one of two senior reviewers, who also compiled the final dataset. Quality assessment was likewise performed by two reviewers, with discrepancies adjudicated by one of two senior reviewers.

Data collection for interventions targeting Consumers was performed by one reviewer, with a second reviewer verifying the data abstraction. Quality assessment was likewise performed by one reviewer and verified by a second. Any differences in the data collection or the quality assessment were resolved by discussion.

* In the April 2013 update, a single reviewer performed the data collection activities with limited assistance from a second reviewer. Quality assessment was also performed by a single reviewer.

Reviews with an AMSTAR score of 3 or higher were assessed by a second reviewer, and differences in scoring between reviewers were resolved by discussion or, if necessary, by third-party adjudication.

Data Synthesis, Presentation, and Rating

Individual Review Summaries

Results of the included reviews were analyzed, summarized, and reported quantitatively and descriptively. Data were organized by overall results and results related to prescribing. Results were broadly reported as:

  • vote counting plus reporting of absolute effect measures (with or without a measure of variability)
  • vote counting plus reporting of relative effect measures (with or without a measure of variability)
  • vote counting alone, by direction of effect or statistical significance, depending on the available information.
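The vote-counting approaches above can be sketched as a simple tally of studies by direction of effect. The coding labels below are illustrative assumptions, not the database's own coding scheme:

```python
from collections import Counter

def vote_count(study_directions):
    """Tally study results by direction of effect.

    `study_directions` is an iterable of labels such as
    "favours intervention", "favours control", or "no difference"
    (hypothetical labels chosen for this sketch).
    """
    return Counter(study_directions)

# Example: a review in which 5 of 8 included studies favour the intervention.
tally = vote_count(
    ["favours intervention"] * 5
    + ["favours control"] * 2
    + ["no difference"]
)
```

A vote count like this would then be reported alongside absolute or relative effect measures where those were available, per the categories listed above.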

To standardize the reporting of prescribing-related outcomes, the following categorizations were developed and used: concordance; appropriate use (appropriate use — dosage; appropriate use — choice; appropriate use — route of administration); cost containment; and other.

Standardized statements and decision rules were used for reporting the evidence for the Results, Conclusions, and Effectiveness sections of each review summary.

Reviews listed as 'summary pending' will be analyzed, summarized, and reported at a later date, and their findings will then be incorporated into the overall evidence summaries of the interventions they address.

Summaries of Intervention Classification Categories

To improve the usefulness of this database, overall summaries of the evidence reported across reviews of a common intervention are provided. Standardized statements and decision rules were used to produce these summaries.

Structure of the Database

Summaries of interventions and individual reviews were produced based on a template. Decisions about the amount and type of information presented were made jointly by the CADTH and EPOC teams.

A list of the individual studies included in each review is provided as a link from the individual review summary page. This list may not be comprehensive for each review. Detailed information on the fields available within each summary, as well as methodology and reasons for missing references, is available in the detailed methodology sections for Professionals and Consumers.

Study Team  

Dr. Jeremy Grimshaw, Principal Investigator
Dr. Sophie Hill, Senior Advisor (Consumers)
Dianne Lowe
Caroline Kaufman
Alain Mayhew 
Michelle Fiander

Misty Pratt
Julia Worswick
Julie Wu
Sharlini Yogasingam

Related Information