7 December 2012

Improved annual national assessment results impossible, say academics

The grade three outcomes do not make sense.

Education experts have raised a red flag over this week's celebrated annual national assessments results, saying that vast improvements of this magnitude are both implausible and misleading to the public.

The assessments were piloted in 2008 to measure numeracy and literacy and this year tested more than seven million pupils in grades one to six and grade nine.

"All the available evidence suggests that changes of this magnitude are simply not possible, locally or internationally," University of Stellenbosch economists Servaas van der Berg and Nicholas Spaull told the Mail & Guardian in an email response to questions.

"If one compares some of these improvements to the largest improvers around the world, it would mean that South Africa has the fastest-improving educational system in the world," the experts said.

This is the second year the results have been made public.

Last year South Africa applauded Basic Education Minister Angie Motshekga's decision to release the results because they are important as a diagnostic tool for addressing the system's ills. But it was the dismal picture the results painted — a national average mark in literacy of 35% among grade threes and 30% in maths among grade sixes — that became the central focus.

On Monday this week the basic education department released a report of 2012's much anticipated results, which, although still bleak, showed some "noticeable increases". Grade three literacy increased from 35% to 52% and numeracy from 28% to 41%, the report said.

But education experts were sceptical about the plausibility of such pronounced improvements and cited numerous recent education studies that could be used to do a "sense check".

"If these results were true, it would mean we have improved more in a single year than Colombia did in 12 years from 1995 to 2007, which was the fastest-improving country of 67 countries tested in the trends in international mathematics and science study for this period," said Van der Berg and Spaull.

They cautioned that sets of results could not be compared unless the same level of difficulty was ensured.

"It is only possible to draw valid comparisons between two tests if they are of equal difficulty, but from the report it is unclear what measures were taken to ensure similar difficulty levels across years."

They said that, although the report mentioned the use of common test items, it was questionable whether these were used for equating purposes.

If this was done at all, it could only have been done for maths, because as the report itself states: "An attempt was made to link the 2012 test to the 2011 test by repeating a few of the items from 2011 in the 2012 test. This was, however, only possible for some grades and specifically with mathematics. Language tests for grades four to six could not link items from 2011, given that the language tests in 2011 were not demarcated into home language and first additional language [as they were for 2012]."

If the results were calibrated to be of similar difficulty in each grade, which is necessary for intergrade comparison, Van der Berg and Spaull asked how it was possible that the grade one mathematics average in 2012 was 68%, but the grade three average, just two grades later, was only 41%.

In a speech this week Motshekga said the "great improvements" were "extremely encouraging" and should "give South Africans great hope that at this rate, we will reach, or even surpass, the targets we have set".

She said this despite the report noting that, "because the tests were pitched at different levels, no equivalence should be assumed and, therefore, any comparisons should be done with extreme caution".

These improvements were "highly unlikely", said Surette van Staden, lecturer in the University of Pretoria's education faculty. She was on the advisory committee that provided advice on the design of the tests. "The results from the Progress in International Reading Literacy study and the Southern African Consortium on Monitoring Educational Quality don't even show improvements like this in five years," she said.

Having high expectations of rapid improvement in the annual national assessments would be a serious mistake, warned Mary Metcalfe, former higher education director general.

"We need to be sceptical of these results," she said. "It's all just too early. Let's use the first two or three years to make sure that the results are credible and can be used as a solid basis for understanding system improvement."

The basic education department declined to respond to the M&G's questions.

See the full interview with Servaas van der Berg and Nicholas Spaull at mg.co.za/ANAs2012