Medical-School Performance Metrics that Predict High-Stakes Clinical-Skills Examination Failure
thesis
Posted on 2024-08-01, 00:00. Authored by Emily Hall
Purpose: Medical schools must assess students’ mastery of key clinical skills (CS) to identify those who are ready to progress and those who need additional training. Local performance metrics previously associated with United States Medical Licensing Examination (USMLE) Step 2 CS examination failure can suggest effective predictors of students struggling to master clinical skills.
Methods: A retrospective analysis compared the performance of University of Illinois College of Medicine (Chicago) students who sat for Step 2 CS from 2013 to 2019 and who passed (n=463) and failed (n=43). Potential predictor variables included medical knowledge-focused metrics (USMLE Step 1 scores, National Board of Medical Examiners shelf examination scores) and clinical-skills-focused metrics (performance on a preclinical doctoring course; clinical performance on clerkships; and data gathering, communication, and patient note performance sub-scores on an end-of-M3 summative OSCE).
Results: A logistic regression model demonstrated that two variables, the preclinical doctoring course composite grade and the patient note subscore of the M3 OSCE, were significantly associated with Step 2 CS failure. Shelf examination scores and the communication rating of the M3 OSCE approached statistical significance. Clinical clerkship grades did not predict Step 2 CS failure. An ROC analysis of the multivariable model showed that 72% of failing students could be identified with 76% specificity.
Conclusions: Preclinical doctoring course grades can serve as an early identifier of students who need additional coaching in clinical skills and, in combination with OSCE scores after clerkships, can be used to support summative decisions regarding the achievement of clinical skills. Although doctoring courses and OSCE programs can be expensive and logistically complex, they are better predictors of struggling students than board scores or clerkship grades. Faculty development or a cadre of expert faculty assessors may be needed to improve the quality and validity of clerkship workplace-based assessments sufficiently to identify failing students.
History
Advisor
Rachel Yudkowsky
Department
Medical Education
Degree Grantor
University of Illinois Chicago
Degree Level
Masters
Degree name
MHPE, Master of Health Professions Education
Committee Member
Alan Schwartz, Jeffrey Cheung