Investigating Patterns of Rasch Item Estimates in Examinations of Foundation English Courses: A Gradient Assumption Examined

Abstract

Rasch modeling has long been an established approach in educational measurement. Nonetheless, a strategic review of recent volumes of the journal Educational Researcher reveals little use of the approach, which may suggest that it enjoys only niche recognition. This article investigates patterns of Rasch item estimates in the examinations of foundation English courses at a university in Thailand. Not only could this draw the attention of the academic community to the approach, but it also touches upon a little-researched area in program evaluation. It is hypothesized that the test items of the English examinations increase incrementally in difficulty, from the first prerequisite course to the last course in the foundation English series. Multiple Rasch analyses are performed on the item responses, addressing a potential rebuttal to the validity argument. A key finding, however, is that the results are mixed: the examinations of the courses are not always aligned with the expected difficulty gradient. Implications for the use of the approach and of the finding are also discussed. These include a call for more studies employing Rasch modeling and a call for scrutiny of courses and examinations that are administered as a series. A framework for dealing with null results is also advocated, whereby such findings ought to be articulated with clarity.
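
To make the gradient assumption concrete, the sketch below illustrates one way item difficulty estimates under a dichotomous Rasch model could be obtained per examination and summarized course by course. It is illustrative only and is not the study's actual analysis, which presumably used dedicated Rasch software; the rasch_jmle helper, the simulated response matrices, and the course labels are all hypothetical. Note also that a rigorous cross-examination comparison would require a linking design (e.g., common anchor items or persons), which the sketch does not attempt.

```python
# Minimal, illustrative sketch: joint maximum-likelihood estimation (JMLE) of
# item difficulties under the dichotomous Rasch model, applied separately to a
# hypothetical response matrix for each foundation English examination.
import numpy as np


def rasch_jmle(responses, max_iter=100, tol=1e-4):
    """Estimate Rasch item difficulties (in logits) from a persons x items 0/1 matrix."""
    X = np.asarray(responses, dtype=float)

    # Drop persons and items with zero or perfect scores: their estimates are not finite.
    X = X[(X.sum(axis=1) > 0) & (X.sum(axis=1) < X.shape[1])]
    X = X[:, (X.sum(axis=0) > 0) & (X.sum(axis=0) < X.shape[0])]

    n_persons, n_items = X.shape
    theta = np.zeros(n_persons)  # person abilities
    b = np.zeros(n_items)        # item difficulties

    for _ in range(max_iter):
        # Model probability of a correct response for every person-item pair.
        P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        info = P * (1.0 - P)

        # Newton-Raphson steps matching observed and expected raw scores.
        theta_new = theta + (X.sum(axis=1) - P.sum(axis=1)) / info.sum(axis=1)
        b_new = b + (P.sum(axis=0) - X.sum(axis=0)) / info.sum(axis=0)

        # Resolve the translation indeterminacy by anchoring mean ability at 0.
        theta_new -= theta_new.mean()

        converged = max(np.abs(theta_new - theta).max(),
                        np.abs(b_new - b).max()) < tol
        theta, b = theta_new, b_new
        if converged:
            break

    return b


if __name__ == "__main__":
    # Hypothetical data: one simulated 0/1 response matrix per examination.
    rng = np.random.default_rng(0)
    exams = {f"Foundation English {i}": rng.integers(0, 2, size=(200, 30))
             for i in (1, 2, 3)}

    for course, data in exams.items():
        difficulties = rasch_jmle(data)
        # Under the gradient assumption, later examinations would show harder items;
        # comparing these means across exams strictly requires linked scales.
        print(course, "mean item difficulty:", round(float(difficulties.mean()), 3))
```

In practice the per-exam difficulty estimates would be inspected item by item (and linked across examinations) rather than reduced to a single mean, but the summary above shows the kind of pattern the gradient hypothesis predicts.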

Presenters

Kunlaphak Kongsuwannakul
Lecturer, Suranaree University of Technology, Nakhon Ratchasima, Thailand

Details

Presentation Type

Paper Presentation in a Themed Session

Theme

Educational Studies

KEYWORDS

Rasch Models, Program Evaluation, Item Responses, Final Examinations, Foundation English
