Differential Item Functioning in PISA Due to Mode Effects

Remco Feskens*, Jean Paul Fox, Robert Zwitser

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review


Abstract

One of the most important goals of the Programme for International Student Assessment (PISA) is assessing national changes in educational performance over time. These so-called trend results inform policy makers about the development of the abilities of 15-year-old students within a specific country. The validity of these trend results requires invariant test conditions. In the 2015 PISA survey, several alterations to the test administration were implemented, including a switch from paper-based to computer-based assessments for most countries (OECD 2016a). This change in assessment mode is examined by evaluating whether the items used to assess trends are subject to differential item functioning across PISA surveys (2012 vs. 2015). Furthermore, the impact of the change in assessment mode on the trend results for the Netherlands is assessed. The results show that the decrease reported for mathematics in the Netherlands is smaller when the results are based on a separate national calibration.
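The abstract describes checking whether trend items function differently across the 2012 (paper-based) and 2015 (computer-based) administrations. The sketch below is not the authors' analysis; it is a minimal, self-contained illustration of one common DIF screening technique (the Mantel-Haenszel common odds ratio on simulated data), with all data and item names being hypothetical placeholders.

```python
# Illustrative sketch only (not the chapter's method): Mantel-Haenszel DIF
# screening for trend items between two administrations, e.g. a paper-based
# reference group and a computer-based focal group. Data are simulated.
import numpy as np

rng = np.random.default_rng(42)

def mantel_haenszel_odds_ratio(item, rest_score, group):
    """Common odds ratio for one item, stratifying on the rest score.

    item       : 0/1 responses to the studied item
    rest_score : total score on the remaining (anchor) items
    group      : 0 = reference administration, 1 = focal administration
    """
    num, den = 0.0, 0.0
    for s in np.unique(rest_score):
        m = rest_score == s
        n = m.sum()
        a = np.sum((group[m] == 0) & (item[m] == 1))  # reference, correct
        b = np.sum((group[m] == 0) & (item[m] == 0))  # reference, incorrect
        c = np.sum((group[m] == 1) & (item[m] == 1))  # focal, correct
        d = np.sum((group[m] == 1) & (item[m] == 0))  # focal, incorrect
        num += a * d / n
        den += b * c / n
    return num / den if den > 0 else np.nan

# Hypothetical data: 10 trend items, two administrations with a small
# ability shift, and one item (item 0) that is harder in the focal mode.
n_items, n_per_group = 10, 2000
ability = np.concatenate([rng.normal(0.0, 1.0, n_per_group),
                          rng.normal(-0.1, 1.0, n_per_group)])
group = np.repeat([0, 1], n_per_group)
difficulty = rng.normal(0.0, 1.0, n_items)
dif_shift = np.zeros(n_items)
dif_shift[0] = 0.6  # mode effect built into item 0 for illustration
logits = (ability[:, None] - difficulty[None, :]
          - dif_shift[None, :] * group[:, None])
responses = (rng.random((2 * n_per_group, n_items))
             < 1.0 / (1.0 + np.exp(-logits))).astype(int)

total = responses.sum(axis=1)
for j in range(n_items):
    rest = total - responses[:, j]        # match examinees on the rest score
    or_mh = mantel_haenszel_odds_ratio(responses[:, j], rest, group)
    delta = -2.35 * np.log(or_mh)         # ETS delta scale
    print(f"item {j:2d}: MH odds ratio = {or_mh:5.2f}, delta = {delta:+5.2f}")
```

In such a screening, items with a large absolute delta (e.g. the deliberately shifted item 0 above) would be flagged for further inspection, which is conceptually similar to asking whether trend items behave invariantly across assessment modes.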

Original language: English
Title of host publication: Methodology of Educational Measurement and Assessment
Publisher: Springer
Pages: 231-247
Number of pages: 17
ISBN (Electronic): 978-3-030-18480-3
ISBN (Print): 978-3-030-18479-7
DOIs
Publication status: Published - 6 Jul 2019

Publication series

Name: Methodology of Educational Measurement and Assessment
ISSN (Print): 2367-170X
ISSN (Electronic): 2367-1718

Keywords

  • Differential item functioning
  • Educational assessment
  • Mode of administration
  • PISA
