Completeness of reporting of simulation studies on responder analysis methods and simulation performance: a methodological survey (Journal Article)

abstract

  • OBJECTIVES: To evaluate the completeness of reporting of simulation studies on responder analysis methods and their simulation performance.
  • DESIGN: Systematic methodological survey.
  • DATA SOURCES: We searched Embase, MEDLINE (via Ovid), PubMed and Web of Science Core Collection from inception to 9 October 2023.
  • ELIGIBILITY CRITERIA: We included simulation studies comparing responder analysis methods and assessing simulation performance (bias, accuracy, precision or variance, power, type I and II errors, and coverage).
  • DATA EXTRACTION AND SYNTHESIS: Two independent reviewers extracted data and assessed simulation performance. We used descriptive analyses to summarise reporting quality and simulation performance.
  • RESULTS: We identified seven simulation studies exploring augmented binary methods, distributional methods and model-based methods. No studies reported the starting seed, the occurrence of failures during simulations, the random number generator used, or the number of simulations. No studies reported simulation accuracy. Responder analysis results were not significantly influenced by covariate adjustment. Distributional methods remained adaptable even with skewed data. Compared with standard binary methods, augmented binary methods yielded greater power and precision. When the threshold lies in the tail of the distribution, a simple asymptotic Bayesian (SAB) distributional approach may not reduce uncertainty but can improve precision.
  • CONCLUSION: Simulation studies comparing responder analysis methods exhibit suboptimal reporting quality. Compared with standard binary methods, augmented binary methods, distributional methods and model-based methods may be better choices, but no single method is best.
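
The reporting items the survey found missing (starting seed, random number generator, number of simulations, simulation failures) and several of the performance measures it assessed (bias, precision, power, coverage) can be made concrete with a short sketch. The Python example below is hypothetical and not drawn from any of the seven surveyed studies; all names and parameter values (N_SIM, SEED, the sample size, effect size and responder threshold) are illustrative assumptions, and a simple dichotomized-outcome comparison stands in for a standard binary responder method.

    # Minimal, hypothetical simulation sketch: records the reporting items the
    # survey found missing (seed, RNG, number of simulations, failures) and
    # computes the performance measures it assessed (bias, precision, power,
    # coverage) for a simple dichotomized-outcome responder analysis.
    import numpy as np
    from scipy import stats

    N_SIM = 2000                       # number of simulation repetitions (report this)
    SEED = 20231009                    # starting seed (report this)
    rng = np.random.default_rng(SEED)  # RNG: NumPy PCG64 (report generator + seed)

    n, mu_ctrl, mu_trt, sd, threshold = 100, 0.0, 0.5, 1.0, 1.0
    # True difference in responder proportions under the data-generating model
    true_diff = stats.norm.sf(threshold, mu_trt, sd) - stats.norm.sf(threshold, mu_ctrl, sd)

    estimates, covered, significant, failures = [], 0, 0, 0
    for _ in range(N_SIM):
        ctrl = rng.normal(mu_ctrl, sd, n)
        trt = rng.normal(mu_trt, sd, n)
        p_c, p_t = np.mean(ctrl > threshold), np.mean(trt > threshold)
        diff = p_t - p_c
        se = np.sqrt(p_c * (1 - p_c) / n + p_t * (1 - p_t) / n)
        if se == 0:                    # a failed repetition (report these too)
            failures += 1
            continue
        estimates.append(diff)
        lo, hi = diff - 1.96 * se, diff + 1.96 * se
        covered += lo <= true_diff <= hi
        significant += abs(diff / se) > 1.96

    n_ok = len(estimates)
    print(f"bias                     = {np.mean(estimates) - true_diff:.4f}")
    print(f"empirical SE (precision) = {np.std(estimates, ddof=1):.4f}")
    print(f"coverage                 = {covered / n_ok:.3f}")
    print(f"power                    = {significant / n_ok:.3f}")
    print(f"failures                 = {failures} of {N_SIM}")

Reporting the seed and generator makes such a run exactly reproducible, and counting failed repetitions (here, zero-variance samples) is what allows readers to judge whether performance summaries rest on the full set of simulations.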

publication date

  • May 23, 2025