Validity and Feasibility of the Electromyography Direct Observation Tool (EMG-DOT)
Leep Hunderfund, Andrea
Objective – To develop a new workplace-based electromyography direct observation tool (the EMG-DOT) and gather validity evidence supporting its use for assessing the electrodiagnostic skills of postgraduate medical trainees.

Methods – The EMG-DOT was developed by experts using an iterative process. Validity evidence spanning content, response process, internal structure, relations to other variables, and consequences of testing was prospectively collected during the 2013–2014 academic year. Both supervising physicians and nerve conduction study technicians served as raters.

Results – The 14-item EMG-DOT had a high content validity index (0.94) and excellent internal-consistency reliability (Cronbach's alpha = 0.94). On a five-point, competency-based rating scale, mean (SD) performance ratings assigned by physician and technician raters were 3.89 (0.79) and 3.57 (0.85), respectively (p < 0.001). Correlations between individual items and global ratings of trainee performance ranged from 0.36 to 0.76 (p < 0.001). Mean scores increased from 70% to 81% over the course of the EMG rotation (p < 0.001) despite a corresponding increase in case complexity. Trainees reported that the observational assessment exercise was useful for improving their knowledge or skills in 82% of encounters (188/230) and that feedback generated by the EMG-DOT improved the quality of care provided to patients in 58% (133/230). Trainees were "satisfied" or "very satisfied" with the observational assessment exercise in 96% of encounters (234/243).

Conclusions – This study provides validity evidence supporting the use of EMG-DOT scores to assess the electrodiagnostic skills of residents and fellows. The EMG-DOT can be used to inform milestone-based assessments of trainee performance in Neurology, Child Neurology, Physical Medicine and Rehabilitation, Neuromuscular, and Clinical Neurophysiology training programs.