Using an Enhancement-Focused Framework Model to Improve the Q ...



Copyright © 2020, Common Ground Research Networks, All Rights Reserved

Abstract

Producing a robust multiple-choice question (MCQ) examination for high-stakes assessment requires significant time and effort to ensure reliability and validity. The enhancement-focused framework recommends stepwise processes to produce a high-quality assessment. These processes, consisting of blueprinting, vetting, standard setting, external review, and student orientation, were used to develop and administer the final-year medical school MCQ examination. Item analysis was performed, question psychometrics were interpreted, and the locally developed examination was compared with the external International Foundations of Medicine Clinical Science Examination. The examination cut score was 55.08 percent and the mean score was 62.04 percent (SD ±8.27). Internal consistency (Cronbach's alpha reliability measure) was 0.759. Fifty-seven percent of questions had a discrimination index > 0.2, and more than 85 percent had a difficulty index > 0.30. Comparison of the locally developed MCQ examination with the external examination yielded a Pearson's correlation coefficient of 0.81, validating the locally developed MCQ examination as a high-quality assessment tool. Despite the availability of multiple contemporary assessment methods, MCQs remain a popular, valid, and reliable summative assessment tool in higher education. The enhancement-focused framework provides a strong foundation for developing a high-stakes MCQ examination. However, continued rigorous scrutiny of all test items and refinement of the individual processes, such as blueprinting, vetting, standard setting, and external review, are imperative to enhance their quality; this refinement is a continuous cycle that enriches locally produced question banks.
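The psychometric quantities reported above (difficulty index, discrimination index, and Cronbach's alpha) follow standard classical-test-theory definitions. As a minimal illustrative sketch, not the authors' actual analysis, the computation over a dichotomously scored (0/1) response matrix might look like this; the function name and the synthetic data are assumptions for illustration only:

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis for a 0/1 scored response matrix.

    responses: 2D array-like, rows = examinees, columns = items.
    Returns per-item difficulty, per-item discrimination
    (upper-minus-lower 27% groups), and Cronbach's alpha.
    """
    responses = np.asarray(responses, dtype=float)
    n_examinees, n_items = responses.shape

    # Difficulty index: proportion of examinees answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Discrimination index: difference in item correctness between the
    # top 27% and bottom 27% of examinees ranked by total score.
    totals = responses.sum(axis=1)
    order = np.argsort(totals)
    k = max(1, int(round(0.27 * n_examinees)))
    lower, upper = responses[order[:k]], responses[order[-k:]]
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    # Cronbach's alpha: internal-consistency reliability based on the
    # ratio of summed item variances to total-score variance.
    item_vars = responses.var(axis=0, ddof=1)
    total_var = totals.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    return difficulty, discrimination, alpha
```

The correlation with an external examination, as reported in the abstract, would then be a Pearson's coefficient between the two score vectors, e.g. `np.corrcoef(local_scores, external_scores)[0, 1]`.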