Abstract
STUDY OBJECTIVE: This study identified the most important quality indicators for online educational resources such as blogs and podcasts.

METHODS: A modified Delphi process comprising 2 iterative surveys was used to build expert consensus on a previously defined list of 151 quality indicators divided into 3 themes: credibility, content, and design. Aggregate social media indicators were used to identify an expert population of editors from a defined list of emergency medicine and critical care blogs and podcasts. Survey 1 asked participants to rate each quality indicator on a 7-point Likert scale. The mean score for each quality indicator was included in survey 2, which asked participants whether to "include" or "not include" each quality indicator. The cut point for consensus was defined as greater than 70% "include."

RESULTS: Survey 1 was completed by 83.3% (20/24) of bloggers and 90.9% (20/22) of podcasters, and survey 2 was completed by 90% (18/20) of both groups. The 70% inclusion criterion was met by 44 quality indicators for blogs and 80 for podcasts. Post hoc, a 90% cutoff was applied to identify a shorter list of 14 quality indicators for blogs and 26 for podcasts.

CONCLUSION: The relative importance of quality indicators for emergency medicine blogs and podcasts was determined. This will help resource producers improve their blogs or podcasts and help learners, educators, and academic leaders assess their quality. These results will inform broader validation studies and efforts to develop user-friendly assessment instruments for these resources.