Constraint-Weighted a-Stratification for Computerized Adaptive Testing with Nonstatistical Constraints: Balancing Measurement Efficiency and Exposure Control
ARTICLE

Educational and Psychological Measurement Volume 69, Number 1, ISSN 0013-1644

Abstract

a-stratification is an item selection method that uses items with small discrimination (a) parameters early in an exam and items with higher a values later, once more is known about the examinee's ability. It achieves much better item usage than the maximum information criterion (MIC). To make a-stratification more practical and more widely applicable, a method is proposed for weighting the item selection process in a-stratification so as to satisfy multiple nonstatistical test constraints. This method is studied in simulation against an analogous method without stratification, as well as against a-stratification using descending- rather than ascending-a procedures. In addition, a variation of a-stratification that allows unbalanced usage of a parameters is included in the study to examine the trade-off between efficiency and exposure control. Finally, MIC and randomized item selection are included as baseline measures. Results indicate that the weighting mechanism successfully addresses the constraints, that stratification greatly helps balance exposure rates, and that the ascending-a design improves measurement precision. (Contains 4 tables and 1 figure.)
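The core idea in the abstract — partition the item bank into strata by ascending discrimination and draw low-a items early — can be sketched as follows. This is a minimal illustrative sketch of plain a-stratification, not the authors' constraint-weighted procedure: the equal-sized strata, the (id, a, b) item tuples, the b-matching rule within a stratum, and the function name are all assumptions for illustration.

```python
def a_stratified_select(items, theta, stage, n_strata, administered):
    """Pick the next item under a simple ascending-a stratification scheme.

    items: list of (item_id, a, b) tuples for the item bank
    theta: current ability estimate
    stage: 0-based index of the stratum in use at this point in the test
    n_strata: number of strata the bank is divided into
    administered: set of item_ids already given to this examinee
    """
    # Sort the bank by ascending discrimination (a) and cut it into
    # equal-sized strata; early stages draw from low-a strata.
    by_a = sorted(items, key=lambda it: it[1])
    size = len(by_a) // n_strata
    stratum = by_a[stage * size:(stage + 1) * size]
    # Within the current stratum, choose the unused item whose
    # difficulty (b) is closest to the current theta (b-matching).
    candidates = [it for it in stratum if it[0] not in administered]
    return min(candidates, key=lambda it: abs(it[2] - theta))
```

With a four-item bank and two strata, stage 0 selects only among the two lowest-a items, reserving the high-a items for later stages when the ability estimate is more stable.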

Citation

Cheng, Y., Chang, H.H., Douglas, J., & Guo, F. (2009). Constraint-Weighted a-Stratification for Computerized Adaptive Testing with Nonstatistical Constraints: Balancing Measurement Efficiency and Exposure Control. Educational and Psychological Measurement, 69(1), 35-49. Retrieved April 21, 2021.

This record was imported from ERIC on April 19, 2013.

ERIC is sponsored by the Institute of Education Sciences (IES) of the U.S. Department of Education.

Copyright for this record is held by the content creator. For more details see ERIC's copyright policy.