An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning

Ebru BALTA1 Secil OMUR SUNBUL2
1Ankara University, Faculty of Educational Science, Department of Measurement and Evaluation, TURKEY.
2Mersin University, Faculty of Educational Science, Department of Measurement and Evaluation, TURKEY.
DOI: 10.14689/ejer.2017.72.2

ABSTRACT

Purpose: Position effects may influence examinees' test performance in several ways and trigger other psychometric issues, such as Differential Item Functioning (DIF). This study aims to construct test forms in which the items are ordered differently depending on their difficulty level (from easy to difficult, or from difficult to easy), to determine whether the items in these forms exhibit DIF, and to examine whether the DIF-detection methods agree.

Research Methods: The Mantel-Haenszel (MH) and Logistic Regression (LR) methods were used to identify whether the test items exhibit DIF. The data consist of the responses of 300 students in the focal group and the reference group, who took three mathematics achievement tests. The data obtained from the tests were analyzed statistically using the R 3.2.0 software.
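To illustrate the kind of statistic the MH method computes: for a studied item, examinees are stratified by a matching criterion (typically the total test score), and the odds of a correct response are compared between the reference and focal groups across strata. The following is a minimal sketch in Python with simulated data; it is not the authors' actual analysis (which was conducted in R), and all variable names and the simulation settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for a reproducible sketch

def mantel_haenszel_dif(item, total, group):
    """Mantel-Haenszel common odds ratio for one studied item.

    item  : 0/1 responses to the studied item
    total : matching criterion (e.g. total test score)
    group : 0 = reference group, 1 = focal group
    Returns (alpha_MH, delta_MH). delta_MH is the ETS delta
    metric; values far from 0 suggest DIF against one group.
    """
    num = den = 0.0
    for k in np.unique(total):                    # stratify by score level
        m = total == k
        A = np.sum((group[m] == 0) & (item[m] == 1))  # reference, correct
        B = np.sum((group[m] == 0) & (item[m] == 0))  # reference, wrong
        C = np.sum((group[m] == 1) & (item[m] == 1))  # focal, correct
        D = np.sum((group[m] == 1) & (item[m] == 0))  # focal, wrong
        T = A + B + C + D
        if T == 0:
            continue
        num += A * D / T
        den += B * C / T
    alpha = num / den                             # MH common odds ratio
    delta = -2.35 * np.log(alpha)                 # ETS delta metric
    return alpha, delta

# Hypothetical data: 300 examinees per group (as in the study),
# with the studied item made slightly easier for the reference group.
n = 300
group = np.repeat([0, 1], n)
ability = rng.normal(0.0, 1.0, 2 * n)
total = np.clip(np.round(10 + 3 * ability), 0, 20).astype(int)
p_correct = 1 / (1 + np.exp(-(ability + 0.5 * (group == 0))))
item = rng.binomial(1, p_correct)

alpha, delta = mantel_haenszel_dif(item, total, group)
print(f"alpha_MH = {alpha:.2f}, delta_MH = {delta:.2f}")
```

The LR method addresses the same question by testing whether adding group membership (and its interaction with the matching score) to a logistic model of the item response significantly improves fit; the two methods can therefore flag different items, as the findings below note.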

Findings: The results of this study can be summarized as follows: ordering the items differently depending on their difficulty level affects the probability that individuals in different groups answer the items correctly. In addition, the LR and MH methods flag different items as exhibiting DIF, although the two methods rank the flagged items similarly with respect to the magnitude of DIF.

Implications for Research and Practice: Further test-development studies should examine whether DIF emerges when test forms are administered with items ordered differently with respect to subject matter and cognitive difficulty level.

Keywords: Item Orderings, Mantel-Haenszel, Logistic Regression, Moodle.