Do Clinical Clerks Provide Candidates with Adequate Formative Assessment during Objective Structured Clinical Examinations?
Abstract
CONTEXT: Various studies have examined whether expert or non-expert raters (faculty or students, evaluators or standardized patients) give more reliable and valid summative assessments of performance on Objective Structured Clinical Examinations (OSCEs). Less studied is whether non-faculty raters can provide formative feedback that allows students to take advantage of the educational opportunity that OSCEs offer. This question is becoming increasingly important as the strain on faculty resources grows.

METHODS: A questionnaire was developed to assess the quality of feedback that examiners provide during OSCEs. It was pilot tested for reliability using video recordings of OSCE performances, and then used to evaluate the feedback given during an actual OSCE in which clinical clerks, residents, and faculty served as examiners on two randomly selected test stations.

RESULTS: The inter-rater reliability of the 19-item feedback questionnaire was 0.69 during the pilot test. Internal consistency was 0.90 during pilot testing and 0.95 in the real OSCE. Using this form, the feedback ratings assigned to clinical clerks were significantly higher than those assigned to faculty evaluators. Furthermore, performance on the same OSCE stations eight months later was not impaired by having been evaluated by student examiners.

DISCUSSION: While evidence of mark inflation among the clinical clerk examiners should be addressed with examiner training, the current results suggest that clerks are capable of giving adequate formative feedback to more junior colleagues.
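The internal-consistency figures reported above (0.90 in the pilot, 0.95 in the live OSCE) are the kind of statistic typically computed as Cronbach's alpha over the questionnaire's items; the abstract does not name the exact method, so the sketch below is an assumption about how such a coefficient is obtained, shown on invented toy data rather than the study's ratings.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for `scores`: one row per respondent,
    one column per questionnaire item (the study's form had 19 items)."""
    k = len(scores[0])  # number of items
    def var(xs):        # sample variance (ddof=1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy example: 4 respondents rating 3 items (hypothetical data).
ratings = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
print(round(cronbach_alpha(ratings), 2))  # → 0.98
```

Values near 1 indicate that the items move together, i.e. the form measures a single underlying construct, which is what the 0.90 and 0.95 figures suggest for the feedback questionnaire.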