Despite the understandable focus on matric pass rates and what these suggest about standards, it might be useful to take a step back and look at issues surrounding the matric examination, distanced as far as possible from any particular year’s results.
The new curriculum, the national curriculum statement (NCS), is a considerable improvement on the stagnant and routinised curriculum that underpinned the old senior certificate.
It is also important to note that in some subjects a stage-wise approach has been adopted, so that the full curriculum will be taught and examined only once the system has developed sufficient capacity (mathematics and mathematical literacy are cases in point). Learners have therefore not necessarily been taught or assessed on the full curriculum in many subjects, and NCS results should be understood in this context.
In most educational systems there are differences between “the curriculum” (what the official documents contain), “the taught/learned curriculum” (what the system actually manages to deliver in the classroom) and “the examined curriculum” (those parts of the curriculum that are reflected in the examination papers). Particularly in relation to the last of these, it is inevitable that certain topics or processes lend themselves to assessment more readily than others.
The problem is that these often come to dominate examination papers and, in turn, exert pressure on teaching. However, in these early stages of implementing the South African curriculum for the senior secondary schooling phase, and the new qualification, the differences between these three curricula are worryingly large, and this needs to be monitored and addressed as a matter of urgency.
A further process that is often misunderstood is standardisation. This is an essential tool, used to correct fluctuations in performance that result from factors within the examination processes themselves rather than from the knowledge and abilities of candidates, and which would otherwise unfairly affect candidates. In themselves, carefully conducted standardisation procedures should not be a cause for concern, and I believe that Umalusi has done and is doing a sterling job in this regard.
But it is in the vexed area of examination paper setting, and the pressure this puts on standardisation, that I believe alarm bells should be ringing.
First, the generally accepted principle that there should be a separation between player and referee is not wholly upheld in the current dispensation.
The department of education, as the provider of education, is also the body that sets and conducts the examinations. Umalusi’s role, while purportedly independent, is quite opaque when it comes to who has the final say on quality.
Second, there is considerable evidence pointing to poorly set examination papers. Umalusi is therefore forced to use standardisation to correct for poor paper setting. This makes it very difficult for anyone to understand what candidates actually achieved and contributes to the suspicion many feel about standardisation.
This last point connects to a key issue for higher education: the NCS results so far are not easily interpreted in terms of what learners know and can do, since standardisation tends to mask real performance levels. In addition, the new qualification is not only unfamiliar but permits students to “pass” at several different levels, thereby qualifying them for entry to bachelor’s, diploma or certificate study.
The differences between these categories are difficult to understand in any meaningful way and have not been satisfactorily articulated in terms of what candidates are bringing to the different learning contexts of these levels of study.
At present, graduation rates at South Africa’s higher education institutions show that the sector is not coping with the students it admits. For example, only 50% of all the students who entered universities in 2000 graduated within five years. The situation will become much more acute if more students enter higher education on the basis of poorly understood results.
In 2005 Higher Education South Africa (Hesa) embarked on the National Benchmark Tests Project (NBTP), which has developed tests designed to help higher education institutions identify the educational needs and achievements of their incoming cohorts and design appropriate curricula. The primary aim of this higher education initiative is to meet two challenges: improving graduation rates and helping to interpret NCS results.
The tests developed by the NBTP are intended to be used alongside the NCS, providing complementary information on performance levels in core underlying abilities in three domains: academic literacy, quantitative literacy and mathematics.
These tests, while based on school curricula, are set against standards reflecting the performance levels expected of students who will succeed at first-year level, given appropriate support of the kinds identified by the tests. In other words, the tests look forward to the educational challenges that lie ahead, and therefore to the educational provision that is needed, rather than back at achievement on a set curriculum.
As the results will not be manipulated through standardisation processes such as those described above, they will accurately reflect actual levels of achievement. In addition, since they will be written during the final year of schooling, they will give higher education institutions advance notice of the extent of the educational provision required to meet students’ learning needs (for example, whether a third of the incoming class in science needs preparatory work of some kind), as well as individual information that could be used at admission to place students effectively in programmes that will meet their needs.
Far from challenging the validity of the NCS results, the NBTP should be viewed as a legitimate tool developed by higher education to improve educational conditions within that sector. The NCS needs to focus on its main role: to provide a fair and equitable certificate for many hundreds of thousands of students, the majority of whom will not proceed to higher education. The NBTP, in contrast, aims to provide finer-grained diagnostic information about students intending to study further: in doing so, it will also provide information that should be of value to schooling.
Professor Nan Yeld is dean of the Centre for Higher Education Development at the University of Cape Town and head of the National Benchmark Tests Project.