08 Jan 2010 09:46
Basic Education Minister Angie Motshekga has done the right thing by stepping into the matric exam mess in Mpumalanga and deciding that the national department will take over the administration of its exams. Stupidity and corruption among education officials in that province have for too long blighted the education experience and prospects of learners.
Mpumalanga matric results came under scrutiny when five papers were leaked in October.
Thirteen people, including an official from the correctional services department, a worker at Clicks, a teacher and 10 learners, were arrested after police found the syndicate photocopying maths, physics and accounting question papers at an internet café in Barberton.
Motshekga’s decision to release this province’s results came at the 11th hour, leaving already anxious candidates to agonise while waiting for their marks.
Writing matric exams is no picnic, and delays in accessing your results only compound the trauma.
That is crass administrative incompetence. But it gets worse: a document in the possession of the Mail & Guardian reveals about 15 problems with the province’s processing of the marks, including incorrect calculations.
Frustrated former education minister Naledi Pandor has complained to the M&G about the incompetence of officials who stored exam scripts in one location and had their computer centre in another, far-off, place, so that each time there was a query they had to trek a distance to locate an exam script.
In 1998, when I still harboured dreams of this country eventually getting a functional education system, I reacted with caution when Mpumalanga suddenly achieved a 72% matric pass rate. An investigation then found the results had been inflated by 20%.
The department’s temerity in believing it could get away with cheating is something that has remained with me. And the education member of the executive committee at the time of the marks inflation was David Mabuza—the current premier of the province.
He and his education cronies have been silent about the latest saga. They have largely left the handling of the scandal to Motshekga and Umalusi.
Is Mabuza hoping that by saying nothing people will forget the incident? Where is the accountability? Why is it that the Western Cape manages its exams relatively well? Is it because the people who do the job happen to have functioning grey matter?
With the Mpumalanga education department coming under the care of the national education department, I hope the national chief director of exams, Nkosinathi Sishi, will emulate University of the Free State vice-chancellor Jonathan Jansen, a maverick with both the common sense and the balls of steel needed to go into a messy situation and clean it up without fear.
And Motshekga needs to send an expert team to investigate all the structures and deficiencies in the provincial department, as Pandor did in the Eastern Cape early last year. If she does not resolve the whole mess, the system will remain a tragicomedy.
Pushing the limits
Quality watchdog Umalusi has the legal right to adjust (or "standardise") marks within certain parameters when there is reason to believe that the performance of candidates has been unfairly affected by factors in the examination process.
Professor Nan Yeld, dean of the University of Cape Town’s Centre for Higher Education Development, said Umalusi routinely undertakes procedures to ensure the integrity of exam results, including moderating question papers and standardising marks according to statistical and educational principles.
Academics have long expressed scepticism about the adjustment of matric marks. They argue that under former education minister Kader Asmal mark adjustments were pushed to the 10% limit. There have also been claims that 2008’s mathematics exam had too few challenging questions and that Umalusi got its adjustments wrong, resulting in too many matriculants achieving A symbols.
Factors taken into account in the standardisation process include historical averages—usually the past five years—and "pairs analysis", in which performance in two cognate subjects is compared. "Furthermore, Umalusi would analyse examination papers themselves to see if there were any items that skew performance," Yeld said.
A big danger in relation to standardisation is the temptation for politicians to exert pressure to raise results and for bureaucrats to give in to this, said Yeld. Pressure to demonstrate the success of educational innovations or curriculum reform can distort the assessment system in many ways.
Yeld said Umalusi maintains that no adjustments, upwards or downwards, will exceed 10%—unless there are compelling reasons.
But in the case of individual candidates it is possible for adjustments to increase marks by up to 50%.
Read more from Prim Gower