The quality of health technology assessment (HTA) reports depends on many factors, one of which is the evidence base from which the HTA is derived. The evidence base is created by gathering information from many sources and performing literature searches. A high-quality search of information resources helps ensure the accuracy and completeness of the evidence base used in HTA reports. To date, no review has determined which elements of the search process have the most impact on the overall quality of the resulting evidence base.
- To identify the elements associated with the accuracy and completeness of the evidence base found using electronic search strategies in different topic areas and apply this knowledge to HTA reports.
- To determine the impact of errors in the elements of the electronic search strategy on the resulting evidence base.
- To propose enhancements in the methods used for creating and evaluating search strategies to directly and positively affect the applicability of HTA reports.
With the goal of developing and validating a process of peer review for electronic search strategies, we considered tools that were developed in other areas of information retrieval that might serve as a basis for peer-reviewing search strategies.
A systematic review, a web-based survey, and peer-review forums were conducted. The systematic review was undertaken to identify evidence related to quality issues and errors in complex electronic search strategies. Evidence was considered from any context, not only from research on systematic review and HTA searching.
The databases searched included Library and Information Science Abstracts (LISA, CSA interface), 1969 to May 2005; Cochrane Methodology Register and Cochrane Methodology Reviews (completed reviews only, The Cochrane Library 2005, Issue 2, Wiley interface); MEDLINE (OVID interface), 1966 to the first week of June 2005; PsycINFO (OVID interface), 1806 to the second week of June 2005; Cumulative Index to Nursing and Allied Health Literature (CINAHL) (OVID interface), 1982 to the second week of June 2005; HealthSTAR (OVID interface), 1987 to May 2005; and Health and Psychosocial Instruments (HaPI) (OVID interface), 1985 to March 2005. Efforts were also made to identify grey literature.
Because of the anticipated paucity of research evidence in some aspects of the electronic search, a web-based survey of expert searchers in systematic reviews and library and information studies was undertaken. The aim of the survey was to gather experts' opinions regarding the impact of search elements on the search results and the importance of each element in the peer review of electronic search strategies. The survey was conducted after the systematic review was completed, so that elements that were identified as potentially important in the review could be addressed in the survey. The original 14 elements studied in the review and five additional elements identified during it were included in the survey. After this, two peer-review forums were held to discuss the results of the systematic review and survey.
The systematic review identified evidence on the importance of 14 of the 19 elements of the electronic search strategy that were initially considered. No evidence was found for two elements, and of the three remaining elements, one additional element emerged as a result of the review. Although 26 tools were identified that could be used as checklists, none had been validated for assessing electronic search strategies. Of these tools, 10 examine the conduct or reporting of the entire search, not just the electronic component.
Opinions were sought through a web-based survey on the elements considered in the systematic review. Fifty-eight respondents completed the survey. The elements were ranked into three tiers of importance based on an assessment of their potential impact on recall and precision. Elements rated as unimportant in peer review were dropped from further consideration. Based on our findings from the systematic review, survey, and peer-review forums, a process for validating the search strategy using a checklist and a peer-review process was developed.
This work fills a gap in the assurance of the methodological quality of systematic reviews by contributing an evidence-based scale for the peer review of the electronic search strategy. The project has received support and participation from the information science community, and this approach to the peer review of search strategies has been endorsed by the Cochrane Collaboration's Information Retrieval Methods Group. A validated process for peer-reviewing search strategies, one that is both transparent and robust, will improve the retrieval of the relevant information that forms the evidence base.