The inter-rater and intra-rater reliability of a new Canadian oral examination format in anesthesia is fair to good (Academic Article)


abstract

  • PURPOSE: In response to the Royal College's request to improve the validity and reliability of oral examinations, the Examination Board in anesthesia proposed a structured oral examination format. Prior to its introduction, we studied this format in two residency programs to determine the reliability of the examiners.
  • METHODS: Twenty faculty and 26 residents from two Canadian residency programs participated (Sites A and B). Pairs of examiners scored five or six residents examined consecutively on two standardized questions, using global rating scales with anchored performance criteria. Residents' performances were scored independently during the examination (Time 1) and later from a videotaped recording (Time 2). Correlations between the scores of the pairs of examiners and between the scores of each examiner were determined.
  • RESULTS: Correlations demonstrating inter-rater agreement between examiners at Site A ranged from -.324 to .915 (mean .506) at Time 1. At Time 2, correlations ranged from .64 to .887 (mean .791). At Site B, correlations ranged from .279 to .989 (mean .707) at Time 1, and at Time 2 correlations ranged from -.271 to .924 (mean .477). Correlations demonstrating intra-rater agreement of examiners at Site A ranged from .054 to .983 (mean .723), and at Site B correlations ranged from -.055 to .974 (mean .662). Correlations >0.4 were seen in 80% of the scores and >0.7 in 50%, indicating fair to good intra-rater and inter-rater reliability using this format.
  • CONCLUSIONS: Despite the limitations of our study, our results compare favourably with those previously reported in anesthesia. We recommend the adoption of this format to the Royal College of Physicians and Surgeons of Canada Examination Board.
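The abstract does not specify the correlation statistic used; a Pearson product-moment correlation over paired examiner scores is one common way such inter-rater agreement figures are computed. The sketch below is illustrative only: the scores and variable names are hypothetical, not data from the study.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical global-rating scores given by a pair of examiners
# to six residents on the same standardized question.
examiner_a = [4, 3, 5, 2, 4, 3]
examiner_b = [4, 2, 5, 3, 4, 3]

r = pearson_r(examiner_a, examiner_b)
print(f"inter-rater correlation: {r:.3f}")
```

Under the thresholds the abstract applies, a value above 0.4 would count as fair agreement and above 0.7 as good agreement for this examiner pair.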

publication date

  • March 2002