Posted on 2022-05-01. Authored by Leizl Joy Nayahangan.
A. Background
When simulation-based courses fail, there are at least two possible reasons: the content and structure were inadequate, or the implementation process was of poor quality. Evidence-based, stepwise approaches to guide the development, delivery, and evaluation of simulation-based courses are well established. However, best practices for implementing these courses, grounded in implementation science, are not widely known or applied. The purpose of this study was to employ consensus-building methodology to define content for a rubric, the Implementation Quality Rubric for Simulation (IQR-SIM), to evaluate the implementation quality of simulation-based courses in health professions education.
B. Methods
A three-round, modified Delphi process involving international simulation and implementation experts was initiated to gather and consolidate expert opinion on criteria for evaluating the implementation quality of simulation-based courses. Candidate items for round 1 were developed based on the Adapted Implementation Model for Simulation (AIM-SIM). The items were revised and expanded to include descriptive anchors for evaluation in round 2. The criterion for inclusion was that 70% of respondents rated an item 4 or 5 on a 5-point importance scale. Round 3 provided refinement and final approval of the content and descriptive anchors.
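To illustrate the inclusion criterion described above, the minimal sketch below applies the 70% threshold to per-item importance ratings. The ratings, item names, and function are hypothetical and are not data or code from the study.

```python
# Minimal illustrative sketch (not study data or study code):
# an item is retained when 70% of respondents rate its importance 4 or 5 on a 5-point scale.

def meets_inclusion_criterion(ratings, threshold=0.70):
    """Return True if the proportion of ratings of 4 or 5 reaches the threshold."""
    high = sum(1 for r in ratings if r >= 4)
    return high / len(ratings) >= threshold

# Hypothetical example ratings from ten respondents per item.
item_ratings = {
    "Item A": [5, 4, 4, 5, 3, 4, 5, 4, 4, 5],  # 90% rated 4 or 5 -> retained
    "Item B": [3, 2, 4, 5, 3, 3, 4, 2, 3, 4],  # 40% rated 4 or 5 -> dropped
}

for item, ratings in item_ratings.items():
    print(item, "retained" if meets_inclusion_criterion(ratings) else "dropped")
```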
C. Results
Thirty-three experts from nine countries participated. The initial pool of 32 items was reduced to 18 items after three Delphi rounds, resulting in the IQR-SIM: an 18-item rubric with a three-point rating scale, the non-scored options “Don’t know/can’t assess” and “Not applicable”, and a comment section.
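The sketch below shows one way a single rubric rating could be recorded, reflecting only the structure reported above (three-point scale, two non-scored options, and a comment field). The item wording and class design are assumptions for illustration and do not reproduce the published rubric content.

```python
# Minimal sketch (hypothetical; not the published rubric): recording one rating
# against the reported IQR-SIM structure.

from dataclasses import dataclass
from typing import Optional, Union

SCORED_OPTIONS = (1, 2, 3)  # three-point rating scale
NON_SCORED_OPTIONS = ("Don't know/can't assess", "Not applicable")

@dataclass
class ItemRating:
    item: str                                  # rubric item text (placeholder)
    rating: Union[int, str]                    # one of SCORED_OPTIONS or NON_SCORED_OPTIONS
    comment: Optional[str] = None              # free-text comment section

    def __post_init__(self):
        if self.rating not in SCORED_OPTIONS and self.rating not in NON_SCORED_OPTIONS:
            raise ValueError(f"Invalid rating: {self.rating!r}")

# Example with an invented item name, for illustration only.
example = ItemRating(item="Placeholder item", rating=2, comment="Partially addressed")
```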
D. Conclusion
The IQR-SIM is an operational tool for evaluating the implementation quality of simulation-based courses. It can be used throughout the implementation process to identify gaps, monitor progress, and promote the achievement of desired implementation and learning outcomes.