Does it matter whom and how you ask? Inter- and intra-rater agreement in the Ontario Health Survey (Journal Article)

abstract

  • A large amount of information in the 1990 Ontario Health Survey (OHS) was collected from proxy respondents using questions administered in face-to-face interviews. Can this type of information represent candid self-reported measures of health status? Inter-rater agreement was assessed using Cohen's kappa statistic for responses to questions that were answered both by individuals about themselves and by proxies on their behalf. Intra-rater agreement, assessing the effect of mode of survey administration (in-person interviews versus self-completed written questionnaires) on the responses, was also investigated using the kappa statistic. We conclude that: (1) proxy responses in the OHS for impairments of emotion and pain are not reliable indicators of self-response (kappa < 0.32), because proxy respondents consistently under-report the burden of morbidity; and (2) levels of morbidity reported by subjects in interviewer-administered questionnaires may underestimate morbidity relative to levels reported on self-administered questionnaires completed in privacy. We also hypothesize that the relative magnitude of inaccuracy introduced by interviewer administration versus proxy reporting depends on the phenomenon being measured: when assessing pain, mode of administration is quantitatively a more important source of disagreement than type of respondent.
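
  • For readers less familiar with the statistic, the following is a minimal sketch, assuming paired categorical responses from two raters, of how Cohen's kappa corrects observed agreement for agreement expected by chance. The function, variable names, and example data below are illustrative only and are not taken from the OHS analysis.

        from collections import Counter

        def cohen_kappa(ratings_a, ratings_b):
            """Cohen's kappa: chance-corrected agreement between two raters."""
            n = len(ratings_a)
            categories = set(ratings_a) | set(ratings_b)
            # Observed agreement: share of subjects for whom both raters give the same category.
            p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
            # Chance agreement: computed from each rater's marginal category frequencies.
            freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
            p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
            return (p_o - p_e) / (1 - p_e)

        # Hypothetical data: self-reports versus proxy reports of a pain impairment (1 = yes, 0 = no).
        self_report  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
        proxy_report = [0, 0, 1, 0, 0, 1, 0, 0, 0, 1]
        print(round(cohen_kappa(self_report, proxy_report), 2))  # prints 0.44 for this made-up sample

    A kappa near 1 indicates near-perfect agreement and a value near 0 indicates agreement no better than chance; the abstract treats kappa values below 0.32 as unreliable for proxy reporting of impairments of emotion and pain.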

publication date

  • February 1997