Meta-Analyses Proved Inconsistent in How Missing Data Were Handled Across Their Included Primary Trials: A Methodological Survey

abstract

  • BACKGROUND: How systematic review authors address missing data among eligible primary studies remains uncertain.
  • OBJECTIVE: To assess whether systematic review authors are consistent in how they handle missing data, both across trials included in the same meta-analysis and with their reported methods.
  • METHODS: We first identified 100 eligible systematic reviews that included a statistically significant meta-analysis of a patient-important dichotomous efficacy outcome. We then retrieved 638 of the 653 trials included in these systematic reviews' meta-analyses. From each trial report, we extracted the statistical data used in the analysis of the outcome of interest and compared them with the data used in the meta-analysis. First, we used these comparisons to classify the "analytical method actually used" by the systematic review authors for handling missing data in each included trial. Second, we assessed whether the systematic reviews explicitly reported their analytical method of handling missing data. Third, we calculated the proportion of systematic reviews that were consistent in the "analytical method actually used" across trials included in the same meta-analysis. Fourth, among systematic reviews that were consistent in the "analytical method actually used" across trials and that explicitly reported a method for handling missing data, we assessed whether the "analytical method actually used" and the reported method were consistent.
  • RESULTS: We were unable to determine the "analytical method actually used" for handling missing outcome data for 397 trials. Among the remaining 241 trials, systematic review authors most commonly conducted "complete case analysis" (n=128, 53%) or assumed that "none of the participants with missing data had the event of interest" (n=58, 24%). Only eight of 100 systematic reviews were consistent in their approach to handling missing data across included trials, and none of these explicitly reported a method for handling missing data. Among the seven reviews that did explicitly report their analytical method of handling missing data, only one was consistent in its approach across included trials (using complete case analysis), and that approach was inconsistent with its reported method (assuming all participants with missing data had the event).
  • CONCLUSION: The majority of systematic review authors were inconsistent in their approach to reporting and handling missing outcome data across eligible primary trials, and most did not explicitly report their methods for handling missing data. Systematic review authors should clearly identify missing outcome data among their eligible trials, specify an approach for handling missing data in their analyses, and apply that approach consistently across all primary trials.
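
The handling approaches named in the results (complete case analysis, and assuming that none or all of the participants with missing data had the event) can be illustrated with a short sketch. The Python example below is not taken from the article and uses purely hypothetical trial counts; it only shows how each assumption changes the numerators and denominators that feed a risk ratio for a dichotomous outcome.

    # Illustrative sketch (not from the article): how three common approaches to
    # missing dichotomous outcome data change a single trial's effect estimate.
    # All counts below are hypothetical.

    def risk_ratio(events_t, n_t, events_c, n_c):
        """Risk ratio for a dichotomous outcome (treatment vs. control)."""
        return (events_t / n_t) / (events_c / n_c)

    # Hypothetical trial: randomized counts, observed events, and participants
    # lost to follow-up (missing outcome data).
    randomized = {"treatment": 100, "control": 100}
    events     = {"treatment": 30,  "control": 40}   # events among followed-up participants
    missing    = {"treatment": 10,  "control": 5}    # participants with missing outcomes

    # 1) Complete case analysis: drop participants with missing data from the denominators.
    rr_complete_case = risk_ratio(
        events["treatment"], randomized["treatment"] - missing["treatment"],
        events["control"],   randomized["control"]   - missing["control"],
    )

    # 2) Assume none of the missing participants had the event:
    #    keep the full denominators, leave the event counts unchanged.
    rr_assume_no_event = risk_ratio(
        events["treatment"], randomized["treatment"],
        events["control"],   randomized["control"],
    )

    # 3) Assume all missing participants had the event:
    #    add the missing participants to the numerators, keep the full denominators.
    rr_assume_event = risk_ratio(
        events["treatment"] + missing["treatment"], randomized["treatment"],
        events["control"]   + missing["control"],   randomized["control"],
    )

    print(f"Complete case analysis:        RR = {rr_complete_case:.2f}")
    print(f"Assume no missing had event:   RR = {rr_assume_no_event:.2f}")
    print(f"Assume all missing had event:  RR = {rr_assume_event:.2f}")

With these hypothetical counts the three assumptions give risk ratios of roughly 0.79, 0.75, and 0.89, which is why applying different approaches to different trials within the same meta-analysis, or applying a method other than the one reported, can matter for the pooled estimate.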

publication date

  • 2020