13 June 2014

Education rankings: There’s madness in WEF methods

Graphic: John McCann

In the past two weeks the South African media has had a field day lamenting the state of maths and science education in the country. This is because the World Economic Forum (WEF) recently ranked South Africa 148th (out of 148 countries) on the quality of its maths and science education.

Let me cut to the chase and say, unequivocally, that the methods used to calculate these education rankings are subjective, unscientific, unreliable and lack any form of technical credibility or cross-national comparability. I am not disputing that South Africa’s schooling system is currently in crisis (it is), or that South Africa performs extremely weakly relative to other low- and middle-income countries (it does). What I am disputing is that these “rankings” should be taken seriously by anyone or used as evidence of deterioration (they shouldn’t).

The mistakes in the WEF’s methodology are so egregious that one need only look at the list of countries and their respective rankings to appreciate how ridiculous they really are. How is it possible that the quality of maths and science education in failed states such as Chad (ranked 127th on the WEF list), Liberia (125th) and Haiti (120th) is better than in modernising middle-income countries such as Brazil (136th) and Mexico (131st)? How do countries such as Madagascar (82nd) and Zambia (76th) outrank countries such as Israel (78th), Spain (88th) and Turkey (101st)?

Preposterous
Although these preposterous rankings sound like an April Fool’s joke gone wrong, they are reported without qualm on page 287 of the WEF Information Technology Report 2014. Even a cursory analysis of the faulty ranking methodology the WEF employed shows how it is possible to arrive at these outlandish “rankings.” The WEF asked between 30 and 100 business executives in each country to answer questions (relating only to their own country), using a scale of one to seven to record their perceptions, with one representing the worst possible situation and seven the best possible situation.

The question relating to maths and science education was phrased as follows: “In your country, how would you assess the quality of maths and science education in schools?” with “one” being “extremely poor – among the worst in the world”, and “seven” being “excellent – among the best in the world”.

In South Africa, 47 business executives were surveyed for these rankings. On the question relating to maths and science, the average score among these 47 executives was 1.9, indicating that the vast majority of these South African business executives believed that the quality of maths and science education in the country was “among the worst in the world.” Yet this is really just a measure of the perceptions of these 47 businessmen, as the department of basic education has correctly pointed out.

By contrast, when the 55 Malawian and 85 Zambian business executives were surveyed, they were more optimistic about the maths and science education provided to students in their countries, yielding average scores of 3.2 and 4.0 respectively.

Outperform
This explains why Malawi ranks 113th and Zambia ranks 76th whereas South Africa ranks 148th. Yet we know from objective cross-national standardised testing in the region that Zambia and Malawi are two of the few countries that South Africa actually does outperform.

Clearly the ratings given by these business executives are subjective and dependent on their particular mental reference points, which obviously differ by country. These 47 South African executives were not asked to rank South Africa relative to other specific countries – such as Madagascar, Malawi or Mali – only relative to “the world”.
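To make the mechanics concrete, here is a minimal sketch, in Python, of how a rank-by-average-perception procedure of this kind works: each country’s score is simply the mean of its executives’ one-to-seven ratings, and countries are then sorted on that mean. The country names and figures below are invented for illustration, not the WEF’s actual data; the point is that nothing in the procedure anchors the ratings to a common standard, so a country with harsher self-assessors can end up ranked below one with more optimistic ones, regardless of how students actually perform.

```python
# Minimal sketch of a rank-by-average-perception procedure (illustrative data only).
# Each country's "score" is the mean of its executives' 1-7 survey ratings;
# countries are then ranked from highest mean to lowest.

from statistics import mean

# Hypothetical survey responses on the 1 (worst) to 7 (best) scale.
survey_responses = {
    "Country A": [2, 2, 1, 3, 2, 2],   # harsh self-assessors
    "Country B": [4, 4, 5, 3, 4, 4],   # optimistic self-assessors
    "Country C": [5, 5, 4, 5, 6, 5],
}

# Average the perception scores for each country.
scores = {country: mean(ratings) for country, ratings in survey_responses.items()}

# Rank countries by average score, best (highest mean) first.
ranking = sorted(scores, key=scores.get, reverse=True)

for position, country in enumerate(ranking, start=1):
    print(f"{position}. {country}: mean score {scores[country]:.1f}")
```

Because the only input is each group’s own frame of reference, the output is a league table of national self-perceptions, not of educational quality.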

Although the perceptions of business executives are important in their own right, it is ludicrous to use these within-country perceptions to rank “the quality of maths and science education” between countries, particularly when we have objectively verifiable, cross-nationally comparable scientific evidence on maths and science performance for at least 113 countries.

Looking at South Africa specifically, we participate in two major cross-national standardised testing systems that aim to compare the mathematics and science performance of South African students with that of students in other countries. The Trends in International Mathematics and Science Study (Timss) tests grade eight students from middle- and high-income countries, and the Southern and Eastern Africa Consortium for Monitoring Educational Quality (Sacmeq) study tests grade six students from 15 countries in sub-Saharan Africa.

Worse than South Africa
Of the countries participating in Sacmeq, South Africa came 8th in maths, behind much poorer countries such as Kenya (2nd), Swaziland (5th) and Tanzania (3rd), but ahead of Mozambique (10th), Namibia (13th), Zambia (14th) and Malawi (15th). Although this situation is no cause for celebration, it does show that these countries – which outrank South Africa in the WEF rankings – are in fact doing worse than South Africa in reality.

If we look beyond Africa to the Timss rankings, South Africa performs abysmally. Of the 42 countries that participated from around the world (including 21 developing countries), South Africa came joint last with Honduras in 2011. This should shock us to the core. But it does not mean that we have the worst education system in the world. Rather, we have the worst education system of those 42 countries that take part in these assessments.

There is a big difference. Only 21 developing countries took part in these assessments, but there are around 115 developing countries in the WEF tables. The fact that Mali, Madagascar, Liberia and Haiti (for example) do not take part in these assessments means that business executives in these countries have very little reliable information on the quality of education in their countries.

In South Africa the basic education department has wisely chosen to take part in these assessments so that we have reliable information on the performance of our education system, however low that performance might be.

Continuing participation
This is one thing for which the department should be commended: it continues to participate in these assessments, which provide valuable information, despite being lambasted by their findings.

Perhaps the best illustration of how flawed the WEF methodology is comes from comparing Indonesia and Japan on the WEF rankings and on the well-respected Organisation for Economic Co-operation and Development’s Programme for International Student Assessment (Pisa) rankings, which, like Timss, tests maths and science.

In the WEF rankings, executives in Indonesia and Japan both gave an average score of 4.7 for the quality of maths and science education in their respective countries. This placed Japan 34th and Indonesia 35th of the 148 countries. Yet, of the 65 countries participating in the 2012 round of the Pisa maths and science testing, Japan came 7th (out of 65) and Indonesia came 64th. Go figure.

Although there are some early signs of improvement in the South African education system, we know that things remain dire. South African students perform worse than those in every other middle-income country that participates in these assessments, and even worse than students in some low-income African countries.

But to claim that South Africa has the worst quality of maths and science education in the world, and to use executives’ perceptions over scientific evidence to do so, is irrational and irresponsible.

The WEF has seriously undermined its own technical credibility by reporting these ridiculous education rankings. Until it rectifies its methodology, no one should take the rankings seriously.

Nic Spaull is an education researcher in the economics department at Stellenbosch University. He can be followed on Twitter (@NicSpaull) and his research can be found at nicspaull.com/research.