It is that time of the year again. Teachers are worried that their school will be an "underperformer" nationally, parents wonder about the quality of education their children receive and journalists are ready to report on what will surely be yet another set of abysmal results.
The approaching annual national assessments of young children and youths in our public schools will be written this month and promise to be controversial again. These tests, colloquially dubbed "the ANAs", are supposed to measure the literacy and numeracy competence and knowledge of pupils in various grades.
Pupils from grades one to six and in grade nine are tested. It is a mammoth administrative and logistical task, and an expensive one.
On the department of basic education website, the days, hours and seconds to the day of the ANAs tick by electronically. The nation is waiting for the big event, as it does for the matric results.
The media have headlined the tests' previous outcomes, as if to assure South Africans of the weak state of the nation's education. We do not wish to dwell on this. On these tests, as on the beaten-to-death international surveys, only a small percentage of the school-going population are able to use mathematical and scientific knowledge well and read and write competently.
During the time that schools anticipate the tests, much is done in preparation. Teachers are briefed by provincial education departments on how to prepare for the test and how to conduct and score it. Past papers are posted on the department website, along with what it refers to as "test exemplars". These are examples of questions covering what may be expected in the tests.
Teachers also receive worksheets with typical questions and they do a great deal to try to prepare the young test-takers. They also send some of these "mock tests" home so that parents and other caregivers can help children to practise for the big event — and that is where we noted severe discontent.
At a recent parent meeting of a primary school in greater Johannesburg, parents complained that the mathematics grade two homework was too hard for their children.
The school principal explained to the parents what the ANAs were and how they were expected to support their children. But one parent insisted that he could not do it because he could not understand all the grade two mathematics questions.
This rang alarm bells for the school principal. She investigated and found that she, too, struggled to understand some of the terminology used in two African languages in the test items.
When our research group got word of this, we administered the 2012 exemplar tests to grade ones and twos in four classes, two with isiZulu as the medium of instruction and two with Sesotho. As in most urban schools, some of the teaching in these classes occurs in English. We tested 140 children and used the memorandum supplied by the department to score the tests.
The results for the grade two pupils were worse than we had expected. But the teachers in the classes we tested appeared to be hard-working women and the school was well managed. The children's workbooks were filled with exercises, the classes were busy and much appeared to be done. There were good print and other resources at the school and its administration was up to date.
We could not understand the shocking results. Did these grade two learners really not learn enough in their school? Their grade one counterparts did substantially better on the tests — a trend in the national results, too.
Perhaps there were simply fewer new concepts in grade one, or perhaps the language used to set the questions was less abstract?
Then we started analysing the questions. We looked at the grade one and two mathematics tests in the other languages.
Suspecting that the tests might pose a problem for pupils taught in African languages, perhaps owing to mistranslation, we interviewed teachers at suburban schools, both public and private (where the language of the classroom is English or Afrikaans). They agreed that the language in which the grade two test items were formulated was too abstract for this level.
As far as we could judge, the tests had been translated directly from the original language (we do not know which it was, but it is likely to have been English) into the other 10 languages.
In the four languages of the tests that we looked at, the problems were almost identical. One teacher, who has a degree in mathematics and teaches at a suburban, elite private school, said the pupils in her class would have problems interpreting at least 16 of the terms.
We then used the results from the four classes and did a statistical analysis using what is known as the Rasch model. What we found confirmed our worst fears.
Apart from the obviously difficult language used to phrase the questions, which we think could be much simpler, the analysis showed that the grade two test's level of difficulty greatly exceeded the ability of the children who wrote it. The test seemed simply unrealistically difficult.
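For readers curious about the mechanics of that claim: in the Rasch model, the probability that a child of ability θ answers an item of difficulty b correctly is 1/(1 + e^(b−θ)), so when an item's difficulty sits far above the test-takers' ability, expected success rates collapse. A minimal sketch (the logit values below are hypothetical illustrations, not figures from our study):

```python
import math

def rasch_p_correct(theta, b):
    """Rasch model: probability that a person of ability theta
    answers an item of difficulty b correctly (logit scale)."""
    return 1.0 / (1.0 + math.exp(b - theta))

# Hypothetical values: an item matched to ability versus one
# two logits harder than the typical child's ability.
p_matched = rasch_p_correct(0.0, 0.0)   # 0.5 - a fair item
p_too_hard = rasch_p_correct(0.0, 2.0)  # ~0.12 - most children fail
```

When most items behave like the second case, low scores reflect the mismatch between test and cohort as much as they reflect what the children have learned.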
We could not understand why that was so — and then it dawned on us. The motivation behind the ANAs may well be to test curriculum delivery more than the children's mathematical ability.
But then one could argue that, by grade two, much of what is new is learned because it is taught. So testing the teacher's work may not be such a bad thing: as long as the teacher sticks to the curriculum, week by week the pupils will learn.
"It is about pacing," said the teachers of the grade two classes where we did this research. "We cannot pace the children according to the curriculum. It expects too much. It wants us to go too fast. There is not enough time to teach properly and to make sure the learners get it."
By now, anyone who does classroom research in mathematics education will know that the majority of the teachers find it impossible to pace children according to the progression expected in the curriculum.
The extreme variance in schools in South Africa is not written into the curriculum. One size is assumed to fit all. So, in the deep Eastern Cape and in rural Limpopo, where there are up to 60 children in a grade two class, learning is assumed to take place at the same pace as in a suburban class with fewer children and more support at home. Is this not a huge fallacy?
But the greater fallacy may be that grade two learners in different types of schools and with different languages will all stand an equal chance of decoding the terms linguistically, even though they may fully understand the mathematics itself.
Some of the phrases and terms were: "Write the following number symbol for the number names"; "draw the minute hand and the hour hand on each of the clock faces to show the indicated time"; "show the symmetry of letters"; "the use of 'behind', 'next to' and 'in front of' — with drawings of a cow, a tractor and a wrench in a row from left to right"; "complete the repeated addition"; "extend the pattern by drawing the shape that comes next".
Because their teacher reads the test items aloud, the children may not even have time to reflect and to find a way out of what for many of them is a linguistic maze, although not necessarily a conceptual one.
We would like all teachers of the foundation phase to know that one of the greatest minds in the history of humankind, Albert Einstein, did not use much language as a young child, perhaps because he was too busy seeing the complexity of the world around him.
The language of mathematics in South African education, as judged by the ANAs, may be creating complexity where simplicity and clarity are sorely needed.
The authors are researchers in the Centre for Education Practice Research at the University of Johannesburg's Soweto campus. Elizabeth Henning is professor of educational linguistics, Dr Graham Dampier is a lecturer in the department of childhood education and Daphney Mawila is a researcher in child cognitive development and testing.