A many‐facet Rasch measurement model approach to investigating objective structured clinical examination item parameter drift (Journal Article)


abstract

Rationale: Objective Structured Clinical Examinations (OSCEs) are widely used for assessing clinical competence, especially in high‐stakes environments such as medical licensure. However, the reuse of OSCE cases across multiple administrations raises concerns about parameter stability, known as item parameter drift (IPD).

Aims & Objectives: This study investigates IPD in reused OSCE cases while accounting for examiner scoring effects using a Many‐facet Rasch Measurement (MFRM) model.

Method: Data from 12 OSCE cases, reused over seven administrations of the Internationally Educated Nurse Competency Assessment Program (IENCAP), were analyzed using the MFRM model. Each case was treated as an item, and examiner scoring effects were accounted for in the analysis.

Results: Despite accounting for examiner effects, all cases exhibited some level of IPD, with an average absolute IPD of 0.21 logits. Three cases showed positive directional trends. IPD significantly affected score decisions in 1.19% of estimates, at an invariance violation of 0.58 logits.

Conclusion: These findings suggest that while OSCE cases demonstrate sufficient stability for reuse, continuous monitoring is essential to ensure the accuracy of score interpretations and decisions. The study provides an objective threshold for detecting concerning levels of IPD and underscores the importance of addressing examiner scoring effects in OSCE assessments. The MFRM model offers a robust framework for tracking and mitigating IPD, contributing to the validity and reliability of OSCEs in evaluating clinical competence.
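For context, the MFRM model referenced in the abstract is commonly written in Linacre's standard rating-scale formulation; the exact facet structure used in this study is not given in the abstract, so the following is a sketch assuming the three facets named there (candidates, cases, examiners):

```latex
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k
```

where \(P_{nijk}\) is the probability that candidate \(n\), rated on case \(i\) by examiner \(j\), receives category \(k\) rather than \(k-1\); \(B_n\) is candidate ability, \(D_i\) is case difficulty, \(C_j\) is examiner severity, and \(F_k\) is the step difficulty of category \(k\). In this framework, IPD appears as instability in \(D_i\) across administrations after \(B_n\) and \(C_j\) have been accounted for.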

publication date

  • July 29, 2024