23 September 2016

Shaky data skews literacy results

COMMENT
Every few years South Africa participates in international tests of reading, maths and science to see what learners know and how this is changing over time.

These tests are independently set and usually comparable over time. Every four or five years we test our grade nines in maths and science (Trends in International Mathematics and Science Study, Timss) and our grade fours and fives in reading (Progress in International Reading Literacy Study, Pirls), and every six years our grade six learners in reading and maths (Southern and Eastern African Consortium for Monitoring Educational Quality, Sacmeq).

This year the results of these assessments are being released to the public. The 2015 grades four, five and nine results will be released in November and December, and the 2013 grade six results were presented to Parliament earlier this month.

In what should have been the biggest news of the post-apartheid period, Parliament heard that the primary education system improved faster than any other education system in the history of international testing — that is, since 1967. Our alleged improvement was more than twice that of the fastest improving country in the world, Brazil.

To be specific, grade six pupils’ test scores improved by more than 0.9 standard deviations between 2007 and 2013, or the equivalent of an extra three years’ worth of learning. To put this in perspective, this is the same as taking Thailand’s or Mexico’s education system and making it equal to Finland’s or Canada’s in six years.
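
(For those wondering how standard deviations translate into years: a common rule of thumb in education research, and the one assumed here, is that one year of schooling is worth roughly 0.3 standard deviations of learning, so 0.9 divided by 0.3 gives the three years quoted above.)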

It makes for a great story or a wish from a fairy godmother, but not for plausible results from a psychometrically rigorous international test. Note that it is not only South Africa that experienced these colossal “gains”, but all Sacmeq countries, which is even more suspicious.

A big part of the alleged Sacmeq improvements arises from the different methodologies employed in 2007 and 2013, which make the two rounds incomparable until they are properly equated.

The results presented to Parliament compare data from 2007 and 2013, yet the way these results were calculated in each period was not the same, and I should know. I was appointed by Sacmeq earlier this year to analyse the results for the international Sacmeq report.

After analysing the data I raised a number of serious technical concerns that significantly affect the comparability and validity of the findings, especially the fact that the weaker learners had been excluded from the final analysis.

I advised the Sacmeq secretariat to address these concerns before publishing the results, because publishing them unresolved would be misleading. When the secretariat’s subsequent response indicated that this would not happen, I said that I could not, in good conscience, continue with the analysis, and I resigned on technical grounds in August.

The issues I raised have not been addressed — the results presented to Parliament were the same as those I had identified as problematic. While this was going on, I emailed the department of basic education to flag my concerns and cautioned against publishing the results.

The department was shocked by the unprecedented improvements. In its presentation to Parliament it said: “Given the significant improvements, the South African national research team requested Sacmeq to double-check the results and were subsequently reassured on their accuracy.”

This is simply not good enough.

The lack of comparability between 2007 and 2013 is so glaring that one doesn’t need inside knowledge of the data to see how implausible the results are.

At the same time that the learner reading scores soared (rising by 0.9 standard deviations), the teacher reading scores plummeted (dropping by 0.8 standard deviations), which is extremely peculiar.

If we are to believe the results, by 2013 basically all South African learners could read, with illiteracy rates dropping from 27% in 2007 to 3% in 2013.

This is at odds with Pirls, the other international test South Africa does.

In 2011 it showed 29% of grade four students were reading-illiterate and 58% could not read for meaning, confirming a host of smaller studies showing the same thing.

If we dig a little deeper, the department’s presentation to Parliament apparently showed that the biggest improvers were Limpopo and the Eastern Cape. These are the very same provinces that were placed under administration (Section 100) in 2011 because they were dysfunctional. To use the minister’s own words, these are the education system’s “pockets of disaster”. Yet Sacmeq would have us believe that illiteracy in Limpopo has been eradicated, dropping from 49% in 2007 to 5% in 2013.

In stark contrast, our other international test, Prepirls, showed that, of the more than 2 900 grade fours tested in Limpopo in 2011, 50% were reading-illiterate and 83% could not read for meaning.

The sad thing about all of this is that it does seem that South Africa is really improving — other reliable evidence points to this — but not nearly as fast as the Sacmeq test scores would have us believe.

According to the presentation, the Sacmeq questionnaire data also shows that learners’ access to their own textbooks increased substantially over the period: from 45% to 66% for reading textbooks and from 36% to 66% for maths textbooks. This is good news.

In the latest turn of events the department explained that the results presented to Parliament were “preliminary”, that an “extensive verification process” is underway, and that it is “fully aware of the issues raised in this regard”.

Why then did it choose to go ahead and present questionable results to Parliament?

Apparently researchers — aka me — have “misled the public” and my motives are “unclear”.

There is nothing unclear about my motives — there is a major technical concern, and the public should not be misled into trusting the results presented to Parliament.

There is also no question about whether the Sacmeq results should have been presented to Parliament. They should not have been presented while so much uncertainty remains about their comparability.

The department has been aware of the serious technical concerns about the results since I emailed a number of members of the department’s research team many months ago. I drew attention to these problems and cautioned against publishing any results until they could be rectified.

What I do not understand is why the department would undermine its own technical credibility by presenting questionable results to Parliament.

I would also not be surprised if the Sacmeq data — once comparable — did show an improvement in line with those of other studies.

Soon we will also have the 2015 Pirls results for grades four and five as another data point to verify what is going on.

In South African education there is probably a good story to tell, so why muddy the waters by reporting impossible improvements based on dodgy data?

The department and Sacmeq must make sure the results of Sacmeq 2007 and 2013 are strictly comparable before reporting any further results and causing additional confusion.

Nic Spaull (PhD) is an education researcher at the universities of Stellenbosch and Johannesburg, and a Thomas J Alexander Fellow at the Organisation for Economic Co-operation and Development. He blogs at NicSpaull.com.