• Media type: E-Article
  • Title: Milestones for neurosurgery sub-interns: a novel evaluation tool to quantitatively differentiate residency applicants
  • Contributor: Bowden, Stephen G.; Tan, Hao; Rothbaum, Michael G.; Cook, Steven H.; Hanft, Simon; Heth, Jason; Morgenstern, Peter F.; Mullin, Jeffrey P.; Orina, Josiah N.; Wilson, Jonathan L.; Winer, Jesse L.; Wolfe, Stacey Q.; Chambless, Lola B.; Selden, Nathan R.
  • Published: Journal of Neurosurgery Publishing Group (JNSPG), 2023
  • Published in: Journal of Neurosurgery, 139 (2023) 6, Pages 1748-1756
  • Language: English
  • DOI: 10.3171/2023.3.jns23259
  • ISSN: 0022-3085; 1933-0693
  • Description:
    OBJECTIVE The study objective was to create a novel milestones evaluation form for neurosurgery sub-interns and assess its potential as a quantitative and standardized performance assessment to compare potential residency applicants. In this pilot study, the authors aimed to determine the form’s interrater reliability, relationship to percentile assignments in the neurosurgery standardized letter of recommendation (SLOR), ability to quantitatively differentiate tiers of students, and ease of use.
    METHODS Medical student milestones were either adapted from the resident Neurological Surgery Milestones or created de novo to evaluate a student’s medical knowledge, procedural aptitude, professionalism, interpersonal and communication skills, and evidence-based practice and improvement. Four milestone levels were defined, corresponding to estimated 3rd-year medical student through 2nd-year resident levels. Faculty and resident evaluations as well as student self-evaluations were completed for 35 sub-interns across 8 programs. A cumulative milestone score (CMS) was computed for each student. Student CMSs were compared both within and between programs. Interrater reliability was determined with Kendall’s coefficient of concordance (Kendall’s W). Student CMSs were compared against their percentile assignments in the SLOR using analysis of variance with post hoc testing. CMS-derived percentile rankings were assigned to quantitatively distinguish tiers of students. Students and faculty were surveyed on the form’s usefulness.
    RESULTS The average faculty rating overall was 3.20, similar to the estimated competency level of an intern. Student and faculty ratings were similar, whereas resident ratings were lower (p < 0.001). Students were rated most highly in coachability and feedback (3.49 and 3.67, respectively) and lowest in bedside procedural aptitude (2.90 and 2.85, respectively) in both faculty and self-evaluations. The median CMS was 26.5 (IQR 21.75–29.75, range 14–32), with only 2 students (5.7%) achieving the highest rating of 32. Programs that evaluated the most students differentiated the highest-performing students from the lowest by at least 13 points. A program with 3 faculty raters demonstrated scoring agreement across 5 students (p = 0.024). The CMS differed significantly between SLOR percentile assignments, despite 25% of students being assigned to the top fifth percentile. CMS-driven percentile assignment significantly differentiated the bottom, middle, and top third of students (p < 0.001). Faculty and students strongly endorsed the milestones form.
    CONCLUSIONS The medical student milestones form was well received and differentiated neurosurgery sub-interns both within and across programs. This form has potential as a replacement for numerical Step 1 scoring as a standardized, quantitative performance assessment for neurosurgery residency applicants.
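  • Note: The analysis pipeline described in the abstract (summing milestone ratings into a CMS, checking interrater agreement with Kendall’s W, and comparing scores across tiers with one-way ANOVA) can be illustrated with a minimal sketch. This is not the authors’ code: the synthetic ratings, the eight-item form layout, the three-rater setup, and the use of NumPy/SciPy are assumptions for illustration only.

```python
# Minimal sketch (not the authors' analysis code): synthetic data illustrating
# a cumulative milestone score (CMS), Kendall's W for interrater agreement,
# and a one-way ANOVA across CMS-based tiers, as outlined in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical ratings: 3 faculty raters x 12 students x 8 milestone items,
# each item scored on the form's four levels (1-4), so the maximum CMS is 32.
ratings = rng.integers(1, 5, size=(3, 12, 8))

# CMS per rater per student (sum of item scores), then averaged over raters.
cms_per_rater = ratings.sum(axis=2)   # shape: (raters, students)
cms = cms_per_rater.mean(axis=0)      # shape: (students,)

def kendalls_w(scores):
    """Kendall's coefficient of concordance for a raters x subjects matrix
    (no tie correction): W = 12*S / (m^2 * (n^3 - n))."""
    m, n = scores.shape
    ranks = np.apply_along_axis(stats.rankdata, 1, scores)  # ranks within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

w = kendalls_w(cms_per_rater)

# One-way ANOVA comparing CMS across three hypothetical tiers
# (bottom, middle, top third of students by CMS).
order = np.argsort(cms)
tiers = np.array_split(cms[order], 3)
f_stat, p_val = stats.f_oneway(*tiers)

print(f"Mean CMS: {cms.mean():.1f}, Kendall's W: {w:.2f}, ANOVA p = {p_val:.4f}")
```

    Computing W by hand as above avoids an extra dependency; with real evaluation data, a post hoc test between tiers (as the abstract reports) would follow the ANOVA.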