• Media type: E-Article
  • Title: Measuring the quality of an objective structured clinical examination in a chiropractic program: A review of metrics and recommendations
  • Contributor: Cade, Alice E.; Mueller, Nimrod
  • Imprint: Brighthall, 2024
  • Published in: Journal of Chiropractic Education
  • Language: English
  • DOI: 10.7899/jce-22-29
  • ISSN: 2374-250X; 1042-5055
  • Keywords: Chiropractic
  • Description: ABSTRACT. Objective: The objective structured clinical examination (OSCE) is a commonly used assessment of clinical skill. Ensuring the quality and reliability of OSCEs is a complex and ongoing process. This paper discusses scoring schemas and reviews checklists and global rating scales (GRS) for marking. Also detailed are post-examination quality assurance metrics tailored to smaller cohorts, with an illustrative dataset. Methods: A deidentified OSCE dataset of 24 examinees from a 2021 cohort, drawn from stations marked with a checklist and GRS, was assessed using the following metrics: cut scores or pass rates, number of failures, R², intergrade discrimination, and between-group variation. The results were used to inform a set of implementable recommendations to improve future OSCEs. Results: For most stations, the calculated cut score was higher than the traditional pass mark of 50% (58.9%–68.4%). The number of failures was low under both traditional pass rates and cut scores (0.0%–16.7%), except for lab analysis, where the number of failures was 50.0%. R² values ranged from 0.67 to 0.97, but the proportion of total variance was high (67.3–95.9). These data suggest potential missed teaching concepts, station marking that was open to examiner interpretation, and inconsistencies in examiner marking. Recommendations included increasing examiner training, using GRSs specific to each station, and reviewing all future OSCEs with the metrics described to guide refinements. Conclusion: The analysis revealed several potential issues with the OSCE assessment. These findings informed recommendations to improve the quality of our future examinations.
  • Access State: Open Access
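  • Illustrative sketch: The abstract names post-examination metrics (cut scores, failure rates, R²) computed from checklist and GRS marks. The minimal Python sketch below shows one way such metrics might be derived, assuming a borderline-regression-style standard setting; the station data, grade scale, borderline grade, and variable names are hypothetical and are not taken from the article.

    # Minimal sketch (not the authors' code) of post-hoc OSCE metrics,
    # assuming borderline regression: checklist percentage scores are
    # regressed on the global rating scale (GRS) grade, and the cut score
    # is the predicted checklist score at the "borderline" GRS grade.
    # All data below are hypothetical.
    import numpy as np

    # Hypothetical station data: checklist score (%) and GRS grade
    # (1 = fail, 2 = borderline, 3 = pass, 4 = good, 5 = excellent).
    checklist = np.array([48, 55, 61, 66, 70, 74, 78, 83, 88, 92], dtype=float)
    grs = np.array([1, 2, 2, 3, 3, 3, 4, 4, 5, 5], dtype=float)
    BORDERLINE_GRADE = 2.0

    # Linear regression of checklist score on GRS grade (borderline regression).
    slope, intercept = np.polyfit(grs, checklist, deg=1)
    cut_score = slope * BORDERLINE_GRADE + intercept

    # R^2: proportion of checklist-score variance explained by the GRS grade.
    predicted = slope * grs + intercept
    ss_res = np.sum((checklist - predicted) ** 2)
    ss_tot = np.sum((checklist - checklist.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    # Failure rates under the traditional 50% pass mark and the derived cut score.
    fail_traditional = np.mean(checklist < 50.0) * 100
    fail_cut = np.mean(checklist < cut_score) * 100

    print(f"Cut score: {cut_score:.1f}%")
    print(f"R^2: {r_squared:.2f}")
    print(f"Failures at 50%: {fail_traditional:.1f}% | at cut score: {fail_cut:.1f}%")

    A low R² or a wide gap between the 50% pass mark and the derived cut score would flag a station for the kind of examiner-training and marking-scheme review the article recommends.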