abstract
- BACKGROUND: Medical practitioners have unmet information needs, and health care research dissemination suffers from both "supply" and "demand" problems. One possible solution is to develop methodologic search filters ("hedges") to improve the retrieval of clinically relevant and scientifically sound study reports from bibliographic databases. To develop and test such filters, a hand search of the literature was required to determine directly which articles should be retrieved and which should not. OBJECTIVE: To determine the extent to which 6 research associates could agree on the classification of articles according to explicit research criteria when hand searching the literature. DESIGN: Blinded inter-rater reliability study. SETTING: Health Information Research Unit, McMaster University, Hamilton, Ontario, Canada. PARTICIPANTS: 6 research associates with extensive training and experience in research methods for health care research. MAIN OUTCOME MEASURE: Inter-rater reliability, measured with the kappa statistic for multiple raters. RESULTS: After one year of intensive calibration exercises, research staff attained a level of agreement at least 80% greater than that expected by chance, as measured by the kappa statistic, for all classes of articles. CONCLUSION: With extensive training, multiple raters can attain a high level of agreement when classifying articles in a hand search of the literature.
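
The abstract reports agreement with "the kappa statistic for multiple raters" but does not name the specific generalization used. The sketch below assumes Fleiss' kappa, a common choice when a fixed number of raters classifies each item; a kappa of 0.80 means observed agreement sits 80% of the way from chance-expected agreement to perfect agreement. The function name and the toy ratings are illustrative, not from the study.

```python
# Minimal sketch of Fleiss' kappa (an assumption: the abstract does not say
# which multi-rater kappa the study used).
from typing import Sequence

def fleiss_kappa(counts: Sequence[Sequence[int]]) -> float:
    """counts[i][j] = number of raters who assigned article i to class j.
    Every row must sum to the same number of raters n (6 in the study)."""
    N = len(counts)        # number of articles rated
    n = sum(counts[0])     # raters per article
    k = len(counts[0])     # number of classes

    # p_j: overall proportion of all ratings falling in class j
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]

    # P_i: observed agreement on article i (fraction of rater pairs agreeing)
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]

    P_bar = sum(P) / N                # mean observed agreement
    P_e = sum(pj * pj for pj in p)    # agreement expected by chance alone
    return (P_bar - P_e) / (1 - P_e)  # proportion of beyond-chance agreement

# Hypothetical example: 5 articles, 6 raters, 2 classes (meets criteria / not).
ratings = [[6, 0], [5, 1], [6, 0], [1, 5], [0, 6]]
print(f"kappa = {fleiss_kappa(ratings):.2f}")  # prints: kappa = 0.72
```

Each row is one article and each column one class, so the row sums equal the number of raters; under this representation the study's reported threshold would correspond to kappa values of at least 0.80 for every article class.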