Modeling Science Achievement in South Africa Using International Achievement Data: Some Comparability Validity Issues

International comparative studies of achievement facilitate national reflection on educational practice and experience. Such studies use sophisticated assessment designs, detailed questionnaires, complex scaling methodologies, and stringent quality control systems to produce comparable data. These data enable the international ranking of countries on a common assessment framework, the identification of indicator variables, and the modeling of achievement. At the heart of these studies is the belief that the assessment enables valid comparisons of achievement. Acknowledging that the international assessments include items that may not be covered by a specific national curriculum, the organizers argue that removing the small number of items unsuitable for any particular country will not appreciably change achievement scores, international rankings, or achievement models. This study investigates this claim using grade 8 science achievement data from TIMSS 2003 for South Africa within a comparative validity framework, examining construct, scale, and measurement equivalence.

We performed a preliminary investigation of construct equivalence. A panel of South African science education experts reviewed the TIMSS 2003 science items, identifying those that were suitable for South Africa and those that were not. Assuming that the TIMSS items are scaled appropriately, we used the published TIMSS item response theory (IRT) item parameters to produce scale scores for students, first using all the TIMSS science items and then only those items deemed appropriate for South Africa. As expected, South African science achievement scores increased as the match between the TIMSS science items and the intended South African curriculum became closer. However, this increase was small, approximately one quarter of a standard deviation.

We acknowledge that deleting a substantial number of science items changes the assessment framework, but argue that South African achievement is more meaningful in the context of what is intended to be taught to South African students. We then investigated scale and measurement equivalence within the IRT model used by TIMSS. Our analyses showed a preponderance of item misfit, indicating a lack of scale and measurement equivalence. While South African students undoubtedly performed poorly on the TIMSS 2003 science items, we caution against using the TIMSS scales to compare South African students to students in other countries.
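The rescoring step described above, estimating a student's ability from published, fixed IRT item parameters and then rescoring on a curriculum-matched item subset, can be sketched as follows. This is a minimal illustration with invented item parameters and responses, not the operational TIMSS scaling (which uses mixed 3PL/2PPC models, plausible values, and population conditioning); all names and values here are hypothetical.

```python
import numpy as np

def ability_estimate(responses, a, b, c, thetas=np.linspace(-4, 4, 801)):
    """Grid-search maximum-likelihood ability estimate under a 3PL model,
    treating the item parameters (a, b, c) as fixed and known, as when
    scoring against previously published item parameters."""
    # P(correct | theta) for every candidate theta (rows) and item (cols)
    p = c + (1 - c) / (1 + np.exp(-1.7 * a * (thetas[:, None] - b)))
    # Binary-response log-likelihood, summed over items for each theta
    loglik = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return thetas[np.argmax(loglik)]

# Hypothetical 10-item test with made-up 3PL parameters
rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.0, 10)       # discrimination
b = rng.uniform(-2.0, 2.0, 10)      # difficulty
c = np.full(10, 0.2)                # pseudo-guessing
resp = (rng.random(10) < 0.5).astype(float)  # one student's 0/1 responses

# Score on all items, then rescore on a curriculum-matched subset
theta_full = ability_estimate(resp, a, b, c)
keep = np.arange(10) < 6            # pretend the first 6 items match the curriculum
theta_subset = ability_estimate(resp[keep], a[keep], b[keep], c[keep])
```

Because the item parameters stay fixed, the two estimates differ only through which items enter the likelihood, which is exactly the comparison behind the quarter-standard-deviation result: dropping curriculum-inappropriate items changes the score without re-estimating the scale.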
