Case Report

Creating Parallel Forms to Support On-Demand Testing for Undergraduate Students in Psychology

Year 2017, Volume: 8 Issue: 3, 288 - 302, 30.09.2017
https://doi.org/10.21031/epod.305350

Abstract

On-demand testing requires that multiple forms of an exam be administered to students in each testing session. However, the use of multiple forms raises test security concerns because of item exposure. One way to limit exposure is parallel forms construction. Parallel forms are different versions of a test that measure the same content areas and have the same difficulty level but contain different sets of items. The purpose of our study is to describe and demonstrate how parallel forms can be created even from a small bank of selected-response items. We present three unique yet plausible test assembly problems. We also provide a solution for each problem using OpenSolver, a free, open-source software add-in for Microsoft Excel. Implications for test design and item development are discussed.
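The assembly problems described in the abstract are solved with OpenSolver, which formulates form construction as a mixed integer program in Excel. As a toy illustration only (the item bank, item IDs, p-values, and content quota below are all hypothetical, not from the study), the same idea can be sketched in Python by exhaustively searching a small bank for two disjoint forms that satisfy a content blueprint while minimizing the difference in mean difficulty:

```python
from itertools import combinations

# Hypothetical 10-item bank: (item_id, content_area, difficulty as a p-value).
bank = [
    ("i01", "A", 0.62), ("i02", "A", 0.55), ("i03", "A", 0.71),
    ("i04", "A", 0.48), ("i05", "B", 0.80), ("i06", "B", 0.66),
    ("i07", "B", 0.59), ("i08", "B", 0.74), ("i09", "B", 0.52),
    ("i10", "B", 0.69),
]

def area_counts(form):
    """Tally how many items a form draws from each content area."""
    counts = {}
    for _, area, _ in form:
        counts[area] = counts.get(area, 0) + 1
    return counts

def assemble_parallel_forms(bank, quota):
    """Enumerate disjoint pairs of forms that each meet the per-area
    quota, and return (gap, form1, form2) for the pair whose mean
    difficulties are closest -- i.e., the most nearly parallel pair."""
    n = sum(quota.values())
    best = None
    for form1 in combinations(bank, n):
        if area_counts(form1) != quota:   # form 1 must match the blueprint
            continue
        rest = [item for item in bank if item not in form1]
        for form2 in combinations(rest, n):
            if area_counts(form2) != quota:  # form 2 must match it too
                continue
            # Objective: minimize the difference in mean difficulty.
            gap = abs(sum(d for *_, d in form1)
                      - sum(d for *_, d in form2)) / n
            if best is None or gap < best[0]:
                best = (gap, form1, form2)
    return best

# Blueprint: each 5-item form needs 2 items from area A and 3 from area B.
gap, form1, form2 = assemble_parallel_forms(bank, {"A": 2, "B": 3})
```

Brute force is only feasible for a bank this small; for realistic banks the problem is posed as a binary integer program (one 0/1 decision variable per item-form pair) and handed to a solver such as OpenSolver, as the study does.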

References

  • Breithaupt, K., & Hare, D. (2016). Automated test assembly. In F. Drasgow (Ed.), Technology and testing: Improving educational and psychological measurement (pp. 128-141). New York: Routledge.
  • Daniels, L., & Gierl, M. J. (in press). The impact of immediate test score reporting on university students' achievement emotions in the context of computer-based multiple-choice exams. Learning and Instruction.
  • Drasgow, F. (2016). Technology and testing: Improving educational and psychological measurement. New York: Routledge.
  • Drasgow, F., Luecht, R. M., & Bennett, R. (2006). Technology and testing. In R. L. Brennan (Ed.), Educational Measurement (4th ed., pp. 471-516). Washington, DC: American Council on Education.
  • Luecht, R. M. (1998). Computer-assisted test assembly using optimization heuristics. Applied Psychological Measurement, 22, 224-236.
  • Luecht, R. M. (2016). Computer-based test delivery models, data, and operational implementation issues. In F. Drasgow (Ed.), Technology and testing: Improving educational and psychological measurement (pp. 179-205). New York: Routledge.
  • Sireci, S., & Zenisky, A. (2016). Computerized innovative item formats: Achievement and credentialing. In S. Lane, M. Raymond, & T. Haladyna (Eds.), Handbook of test development (2nd ed., pp. 313-334). New York: Routledge.
  • van der Linden, W. J. (1998). Optimal assembly of psychological and educational tests. Applied Psychological Measurement, 22, 195-211.
  • van der Linden, W. J. (2005). Linear models for optimal test design. New York: Springer.


Details

Journal Section Articles
Authors

Mark Gierl 0000-0002-2653-1761

Lia Daniels 0000-0001-9202-2538

Xinxin Zhang 0000-0003-4926-7980

Publication Date September 30, 2017
Acceptance Date August 24, 2017
Published in Issue Year 2017 Volume: 8 Issue: 3

Cite

APA Gierl, M., Daniels, L., & Zhang, X. (2017). Creating Parallel Forms to Support On-Demand Testing for Undergraduate Students in Psychology. Journal of Measurement and Evaluation in Education and Psychology, 8(3), 288-302. https://doi.org/10.21031/epod.305350

Cited By


Building On-Demand Test Forms in R
Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi
Halil Sarı
https://doi.org/10.21031/epod.521330