Abstract
In TIMSS-95, participating countries could optionally administer the TIMSS Performance Assessment, a set of practical tasks considered to match the Dutch intended curriculum well. In 1995, however, Dutch students did not score as expected on this test, revealing a discrepancy between the intended and the attained curriculum. The test was therefore replicated in 2000. The results show increased teacher acceptance of the test, but still no significant gain in Dutch students' achievement. In addition, the study showed that, provided reliability is well controlled, there are valid mathematics assessment alternatives that can supplement paper-and-pencil tests, in the Netherlands and elsewhere.
| Original language | English |
|---|---|
| Pages (from-to) | 141-154 |
| Journal | Educational research and evaluation |
| Volume | 11 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2005 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 4 Quality Education
Cite this
Trends (1995-2000) in the TIMSS mathematics performance assessment in the Netherlands. Educational Research and Evaluation, 11(2), 141-154 (2005).