7 December 2012

Vast improvements in pupils’ national test results ‘not possible’

The improvement in this year's Annual National Assessment results over last year's cannot possibly be accurate, as it would mean South Africa's education system has improved more in one year than Colombia's did in 12 years.

Education experts have sounded an alarm bell over the basic education department's claim of "great improvements" in the annual national assessment results released this week.

The assessments measure numeracy and literacy and this year tested over seven million pupils in grades one to six and grade nine.

Last year was the first time the results were made public and the country learned about the shocking statistics they held – a national average mark in literacy of 35% among grade threes, and 30% in maths among grade sixes.

On Monday the basic education department released a report of this year's results which, while still bleak, nevertheless showed some "noticeable increases". Grade three literacy increased to 52% and numeracy in the grade increased from 28% to 41%, the report said.

But these improvements are impossible and are misleading the public, experts in the University of Stellenbosch's economics department, Servaas van der Berg and Nicholas Spaull, told the Mail & Guardian this week.

Do you think the 2012 results can be fairly compared to the 2011 results?
The results for 2011 and 2012 should not be compared.

It is only possible to draw valid comparisons between two tests if they are of equal difficulty, but it is unclear from the report what measures were taken to ensure similar difficulty levels across years.

Although the report mentioned using common test items, we don't know if these were used to scale the tests to the same difficulty level.

If this was done at all, it could only have been for mathematics, as page 13 of the report explains: "An attempt was made to link the 2012 test to the 2011 test by repeating a few of the items from 2011 in the 2012 test. This was, however, only possible for some grades, and specifically with mathematics. Language tests for grades four to six could not link items from 2011 given that the language tests in 2011 were not demarcated into home language and first additional language".

If the tests were not converted to have the same difficulty level, they cannot validly be compared. Furthermore, the 2011 assessment report explains that the Human Sciences Research Council (HSRC) created the database of item-level responses only for grade three and grade six, so equating could only have been done for those grades, if it was done at all.
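To make the point concrete, the sketch below (in Python, using entirely hypothetical numbers, and not the department's or the HSRC's actual procedure) illustrates one standard way common anchor items are used: chained linear equating, which expresses the new year's scores on the old year's scale so that the two averages can be compared like for like.

import numpy as np

def chained_linear_equate(x, new_total, new_anchor, old_total, old_anchor):
    # Express a score x from the new test on the old test's scale, using the
    # anchor items (identical questions that appear in both years' tests).
    # Step 1: within the new cohort, map the total score onto the anchor scale.
    v = np.mean(new_anchor) + np.std(new_anchor) / np.std(new_total) * (x - np.mean(new_total))
    # Step 2: within the old cohort, map that anchor-scale score onto the old total scale.
    return np.mean(old_total) + np.std(old_total) / np.std(old_anchor) * (v - np.mean(old_anchor))

# Hypothetical data only: anchor-item scores barely move (38% to 40%), while the
# raw totals jump from 35% to 52%, suggesting the new test form was simply easier.
rng = np.random.default_rng(1)
old_total  = rng.normal(35, 12, 2000)   # 2011-style totals (%)
old_anchor = rng.normal(38, 12, 2000)   # 2011 scores on the anchor items (%)
new_total  = rng.normal(52, 12, 2000)   # 2012-style totals (%)
new_anchor = rng.normal(40, 12, 2000)   # 2012 scores on the same anchor items (%)

# In this made-up example the raw 2012 average of 52% corresponds to only about
# 37% on the 2011 scale, so most of the apparent gain would disappear once equated.
print(round(chained_linear_equate(52.0, new_total, new_anchor, old_total, old_anchor), 1))

Without some such linking exercise, there is no way of knowing how much of a reported improvement reflects real learning gains rather than an easier test.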

The sheer magnitude of the changes between these two years makes one sceptical that the tests were set at anything near the same difficulty level, and that the changes documented in the report are plausible.

In some primary school grades there were very big improvements from 2011 to 2012. For example, the average mark for literacy in grade three increased from 35% in 2011 to 52% in 2012. Are these results plausible?
We found this particularly strange. All the available evidence suggests that changes of this magnitude are simply not possible, locally or internationally.

It may help to provide some background information on the relative magnitude of these increases.

Grade three literacy improved by 17 percentage points year-on-year. If one compares this to the largest improvers around the world, it would mean South Africa has the fastest-improving educational system in the world. If these results were true it would mean we have improved more in a single year than Colombia did in 12 years from 1995 to 2007 – and Colombia was the fastest-improving country of the 67 countries tested in the "Trends in international mathematics and science study" for this period.

Looking at a different survey, this would mean we have improved more in a single year than Russia did over the 2001 to 2006 period in the "Progress in international reading literacy study" – and Russia experienced the largest increase in student achievement of the 28 countries tested in the study over this period. This is simply not possible.

One could also use local comparisons as a sense-check on the assessments' alleged improvements.

Every year the Western Cape conducts systemic evaluations of grade three and grade six pupils. Between 2011 and 2012 there was almost no improvement in these results, yet according to the annual national assessment results the province improved by 14 percentage points. Given that the Western Cape's tests are calibrated to be of equal difficulty year-on-year, and that they are marked centrally, they are currently a more reliable indicator of true progress in learning than the national assessments. They provide strong evidence that the national assessments exaggerate whatever improvement there may have been in learning in our schools.

Apart from international and local comparisons, the results for this year's assessments do not appear internally consistent. If the results were calibrated to be of similar difficulty for each grade, which is necessary to be able to compare them, then how is it possible that the grade one mathematics average in 2012 was 68% but the grade three average was only 41%, just two grades later? The performance deteriorates further to 27% in grade six and a dismal 13% in grade nine. Are these tests of equal difficulty relative to their grade? If so, it would indicate much better performance in the lower grades than in the higher ones.

Yet it would seem that there was no inter-grade linking of items, which is necessary to ensure that difficulty levels are similar.

This is made explicit in the report: "There was no deliberate attempt to include questions to assess the degree to which the assessment standards of earlier grades had been achieved". Thus one cannot compare the results of one grade with the next, or say that performance is deteriorating as the grades progress.

What are your comments on the verification process of this year's results?
From the report it would seem that there was no external, independent verification process for this year's assessments. In 2011 the HSRC verified the results of grade three and grade six by re-marking a nationally representative sample of schools. This year, the plan was that a different verification process would be followed, but we don't know what actually transpired.

Why are the national assessments important?
The assessments are an important and worthwhile endeavour and are needed to improve the quality of education in South Africa. The introduction of these tests is one of the most important advances in educational policy in recent years as it provides a source of information for teachers, students, parents and policy makers that was absent before. Without a testing system like the national assessments, it is not possible to determine which schools need what help, or to diagnose and remediate learning problems early enough that they do not become insurmountable deficits. The national assessments give teachers information about the level they should assess at, and the level of cognitive demand they should aim at. They can provide objective feedback to parents about their children's performance, which is essential for them to know how the school system serves them and what learning deficits their children may have. Parents and children have a right to know this, and poor and illiterate parents doubly so.

The real problem in our system is the failure of most students to master foundational numeracy and literacy skills in primary school, which then spills over into secondary schools. However, for the national assessments to provide this information on performance in schools, they need to be reliable indicators of learning across grades and over time. To this end the basic education department should put in place an independent verification process, and the tests should adhere to international guidelines for standardised testing. The fact that the national assessments' results from 2011 and 2012 are incomparable is highly unfortunate. It means that schools, teachers and parents are getting erroneous feedback. Thus the 2012 national assessment results, compared to those of 2011, create an impression of a remarkable improvement in school performance which did not really occur.