Abstract: This empirical study was designed to determine the impact of computerized adaptive test (CAT) administration formats on student performance. Students in medical technology programs took both a paper-and-pencil test and an individualized computerized adaptive test. Students were randomly assigned to adaptive test administration formats to ascertain the effect on student performance of altering: (a) the difficulty of the first item, (b) the targeted level of test difficulty, (c) minimum test length, and (d) the opportunity to control the test. Computerized adaptive test data were analyzed with ANCOVA; the paper-and-pencil test was used as a covariate to equalize ability variance among cells. The only significant main effect was for the opportunity to control the test, and there were no significant interactions among test administration formats. This study provides evidence concerning the adjustment of traditional computerized adaptive testing toward more familiar testing modalities.