Methodological search filter performance: assessment to improve efficiency of evidence information retrieval.

Lead Research Organisation: Oxford University Hospitals NHS Trust
Department Name: Churchill Hospital

Abstract

NICE requires research to help it decide which treatments and diagnostic methods should be funded by the NHS, and it has highlighted the need for research into methods for reviewing research evidence. Identifying relevant research studies underpins most NICE reviewing. Methodological search filters, carefully selected collections of words and phrases used to search databases, are widely used to identify specific research designs such as randomized controlled trials or economic evaluations.
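For illustration, a simplified filter for randomized controlled trials in PubMed might look like the sketch below. It is loosely modelled on published RCT filters such as the Cochrane Highly Sensitive Search Strategy, but the exact terms, field tags and combinations vary between filters and databases, so it should be read as indicative rather than as any specific published filter:

    (randomized controlled trial[pt] OR controlled clinical trial[pt]
     OR randomized[tiab] OR placebo[tiab] OR randomly[tiab] OR trial[ti])
    NOT (animals[mh] NOT humans[mh])

Such a filter combines controlled vocabulary (publication types, MeSH headings) with free-text terms in titles and abstracts; a filter tuned for sensitivity casts this net wide, while one tuned for precision restricts it.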

Little is known about how well search filters work in finding research. As a result, their use as a standard tool by NICE researchers may not be informed by adequate evidence about how filters perform across different subjects, questions and databases.

Our proposal is to investigate how search filter performance can be measured and which measures are most useful to researchers. We also plan to investigate systems and approaches to provide better access to relevant and useful performance data on methodological search filters. This research would enhance the tools available to find research evidence informing NICE appraisals, guidelines and other guidance, and improve knowledge of those tools. The findings would also benefit other national technology assessment agencies, guidelines groups and related bodies.

Technical Summary

NICE has commissioned a scoping study from the MRC to identify NICE's methodological research needs. This highlighted the need for research into systematic review methods, evaluating diagnostic and screening technologies and conducting decision analysis. Efficient evidence retrieval is implicit within these priorities and underpins most NICE activity.

Evidence retrieval aims to provide appropriate volumes of relevant information within existing resource constraints, to ensure a robust dataset from which accurate parameter estimates can be derived, and to minimize biases from incomplete retrieval.

One information retrieval tool used by HTA information professionals is the methodological search filter. It identifies specific study designs such as randomized controlled trials, economic evaluations and quality of life studies.

Search filters aim to strike a delicate balance between maximizing sensitivity (identifying as high a proportion as possible of relevant records) and achieving adequate precision (minimizing the number of irrelevant records to be assessed). Well-designed filters that optimize precision whilst achieving acceptable sensitivity ensure that limited resources are used efficiently, by reducing the effort spent assessing irrelevant material retrieved by literature searches.
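As a minimal sketch of how these performance measures are typically computed, the Python function below scores a filter against a gold standard of records known to be relevant. The function and variable names are illustrative assumptions, not part of any established tool:

    # Minimal sketch: scoring a search filter against a gold-standard
    # set of known relevant records. All names here are illustrative.

    def filter_performance(retrieved, relevant, database_size):
        """retrieved and relevant are sets of record IDs;
        database_size is the total number of records searched."""
        true_pos = len(retrieved & relevant)        # relevant records the filter found
        false_pos = len(retrieved - relevant)       # irrelevant records it also found
        irrelevant = database_size - len(relevant)  # irrelevant records in the database
        return {
            # share of relevant records retrieved
            "sensitivity": true_pos / len(relevant),
            # share of retrieved records that are relevant
            "precision": true_pos / len(retrieved),
            # share of irrelevant records correctly excluded
            "specificity": (irrelevant - false_pos) / irrelevant,
        }

    # Example: a filter retrieves 4 records from a 1,000-record database;
    # 3 records are known to be relevant, and the filter found 2 of them.
    scores = filter_performance(
        retrieved={"r1", "r2", "r3", "r4"},
        relevant={"r2", "r3", "r5"},
        database_size=1000,
    )
    print(scores)  # sensitivity 0.67, precision 0.50, specificity ~0.998

Because bibliographic databases are large and relevant records scarce, specificity is almost always close to 1, which is one reason sensitivity and precision dominate discussions of filter performance.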

Search filters are, however, an under-researched tool and are rarely validated beyond the data in their original publication. This means that their use as a tool for technology assessment, guideline development and evidence synthesis may be pragmatic rather than evidence-based.

As search filters proliferate, the key question becomes how to choose between them. Good progress has already been made by the project co-applicants in collecting search filters on a single website (http://www.york.ac.uk/inst/crd/intertasc/). The group has also developed critical appraisal tools for assessing search filters.

It is still unclear, however, how searchers choose between search filters. Filter performance, in terms of sensitivity, precision or specificity for different types of searches, is likely to be important in informing that choice, but we have little evidence from which to develop such information for the website and so inform choice.

We propose to investigate the methods used to assess search filter performance and to explore what searchers require of search filters. We also propose to explore systems and approaches to provide better access to relevant and useful search filter performance data.

This research should enhance the evidence retrieval tools available to inform NICE appraisals, guidelines and other guidance, improve awareness of these tools and benefit other technology assessment agencies, guidelines groups and related bodies.
