19 September 2014

The flaws of national assessments


COMMENT

“In God we trust. Everyone else, bring data,” billionaire and former New York mayor Michael Bloomberg once said.

This week, about seven million pupils throughout the country sat the Annual National Assessment (ANA) tests, to – we have to trust – provide that data. Pupils in grades one to six and grade nine, along with a selection of grade eights, wrote these tests, which assess literacy and numeracy with the laudable goal of providing a statistical window on the performance of South African pupils and schools.

The ANAs are written in class time and invigilated by classroom teachers. Some papers (specifically grades three and six) are marked at district-level marking centres; teachers themselves mark the rest. “Independent agents” then moderate a selection of question papers in schools.

The department of basic education’s 2013 report on that round of testing stated that there “has certainly been an improvement in the quality and standard of the tests and in the administration of ANA across all provinces”. This raises the question: if so many improvements were so obviously needed, why were they not implemented years ago?

As they stand, the ANAs are largely a waste of taxpayer money and thousands upon thousands of hours in administration, assessment and analysis. The ANAs are fundamentally flawed. Here’s why.

Why the ANAs are a waste of time
Firstly, the mode of assessment is to a large degree subjective, especially in languages and maths. Each year, a new test is devised by a panel of well-meaning teachers. These tests differ from those of previous years, containing different biases and texts.

It is understandable that the minute content of the tests should change year by year – but so too do the formats of the tests and the ways marks are allocated. Does this make sense when we are trying to obtain a representative, objective sample of achievement in any given year? In brief, the design of the tests leaves many unanswered questions.

Connected to this, most questions in both the numeracy and literacy papers I have surveyed – they are all available on the basic education department’s website – require answers to be written out in full. This is excellent practice in the classroom but essentially useless for gathering robust data: markers will differ in how they interpret and assess any pupil’s answer.

Although a large sample size may flatten out random inconsistencies, there can be no true objectivity in assessing written work.

International standardised tests such as the Programme for International Student Assessment and the Graduate Management Admission Test use a fixed combination of multiple-choice and written answers. Why don’t we?

Secondly, the marking itself is problematic. This is especially true in grade nine (an exit year in our system), where teachers mark their own classes’ papers, often under great pressure to complete the marking in a short time.

This, combined with matric examinations and the pressures of the job, makes the ANAs a second-tier priority. It follows that there is little incentive to mark accurately and thoroughly.

The best way to get someone to do a bad job is to ask them to do it not only for free but also on top of their already crowded workloads.

Further, teachers are naturally worried about how their own performance is assessed and so are likely to help pupils before, during and even after the test, by marking as leniently as possible wherever possible. The cumulative effect of thousands upon thousands of teachers marking pupils’ almost-but-not-quite-right answers as correct has a skewing effect on the reliability of the data collected.
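The statistical point is worth spelling out: random marking noise shrinks as the sample grows, but a consistent upward nudge does not. A minimal sketch in Python – every number here invented purely for illustration, not drawn from ANA data – makes the distinction concrete:

```python
import random

# Illustrative simulation (invented numbers, not real ANA data):
# random marker noise averages out over a large sample,
# but a systematic leniency bias shifts the result no matter how large N is.

random.seed(1)

N = 1_000_000        # pupils, roughly the order of an ANA grade cohort
TRUE_MEAN = 50.0     # assumed true average score out of 100
NOISE = 5.0          # random marker disagreement, a few marks either way
LENIENCY = 3.0       # assumed systematic upward nudge from lenient marking

def observed_score() -> float:
    true_score = random.gauss(TRUE_MEAN, 10.0)    # pupil's actual performance
    marker_noise = random.uniform(-NOISE, NOISE)  # cancels out on average
    return true_score + marker_noise + LENIENCY   # the bias never cancels

mean = sum(observed_score() for _ in range(N)) / N
print(f"observed mean: {mean:.1f} vs true mean: {TRUE_MEAN}")
# prints roughly 53.0: the noise has flattened out, the bias has not
```

However many scripts are added, the simulated average stays about three marks too high. More data cannot correct a biased collection process; only a more consistent one can.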

Thirdly, the ANAs are aligned with a relatively new curriculum policy framework, widely known by its acronym, Caps. This has the obvious effect of unsettling years of entrenched pre-Caps work schedules and practices in schools. Also, pupils are tested in the third quarter of the year, when much of the work tested may not yet have been covered. As the department’s report stated: “This makes it difficult to compare performance.”

‘Inaccuracy and inconsistency’
Indeed, the report goes on to say that in 2011 pupils “were tested in February on work they had covered in the previous year”. So schools may not have taught certain sections or may be planning to teach lessons on a specific subject after the ANAs are written – negating their entire raison d’être.

Some of these problems could be considered minor in themselves, and no doubt the department has gained some insight from the ANA process over the past four years. However, in combination they amount to a recipe for inaccuracy and inconsistency, and the result is a waste of time and money.

So what is the remedy? Overall, thorough and regular standardisation needs to be implemented. Tests need to be piloted years before they are written, and need to have the same format and the same mark allocation each year.

There should be a focus on questions that leave little room for subjectivity on the part of the marker – for example, by following best practice around the world and using multiple-choice questions wherever possible.

Whatever choice the department makes about who marks the papers, this should be as consistent as possible, and those marking should be properly remunerated for their time and effort.

As teachers we are aware of the incredible organisation and work that the ANAs involve. But the process is opaque at best. The department needs to communicate the entire process, and who its stakeholders are, to teachers and the public – which it is not doing.

The ANAs are a golden opportunity to gain valuable and trustworthy data about pupil, school and teacher achievement in South Africa. As it stands, we are throwing that opportunity away.

Luke Simpson is a teacher in the Western Cape.