In our last supplement, Being poor matters (Mail & Guardian, September 16-22 2011), we explored the dynamics of poverty and inequality in South Africa and why we are faced with such high levels of both.
What is clear is that they are major challenges for the country; but do we really know what all the causes are? Do we fully understand the nature, size and driving forces of these problems? And are we doing everything we can to address them?
Research evidence can help us understand the complexity of these issues and, crucially, guide us in deciding what needs to be done about them… and by whom.
Reducing poverty and inequality is at the heart of the National Planning Commission’s (NPC’s) National Development Plan: Vision for 2030, released on November 11 2011. It sets out the key challenges we confront and proposes ways of addressing them. It maps out the kind of South Africa we want to live in 20 years from now and the roads we have to follow to get there.
To illustrate the tremendous challenges many young South Africans face, the NPC produced a video that paints a rather bleak picture of the future awaiting Thandi, a fictional 18-year-old woman who completed matric in 2010. Having passed matric, she has her whole life ahead of her, but the legacy of the past is still with us and the limited progress we have made since 1994 determines her life chances.
Thandi’s story
In January 1999, 1.4 million children started school. By 2010, 46% had dropped out and only 600 000 entered matric. Of these, only 13% obtained a university pass. Thandi was one of them. But because she is an African female who went to a school where virtually everyone was poor, her chances of getting into university diminished to just 4%.
Getting a university pass did not mean she could attend university: financial constraints and other obstacles meant she had to stay home and look for a job. Sadly, Thandi’s chances of getting a job in the first five years after school are slim, at only one in four, and she is unlikely ever to earn above the median income of around R4 000 per month; only 2% do. She may only receive bits of work here and there for the rest of her life and will probably remain below the poverty line of R418 per month.
Ironically, the first time she will break through the poverty trap will be when she turns 60 and receives a state old-age pension. But, the NPC tells us, there is hope: “South Africa has the means, the goodwill, the people and the resources to eliminate poverty and reduce inequality, but it will require leadership from all sectors of society, a capable state and a social pact.”
However, to achieve these goals and make a real impact on people’s lives, we need appropriate policy responses. And to formulate these policies, we need reliable evidence on which to base them.
The Programme to Support Pro-Poor Policy Development (PSPPD), a partnership between the Presidency and the European Union (EU), held a two-day reflective event entitled “Is evidence the answer? It depends…” in November 2011 to critically examine what role evidence plays in policy-making.
Evidence-informed policy is important
The PSPPD promotes evidence-based policy interventions which address poverty and inequality. One of the ways in which it does this is by providing platforms for critical engagement between policy-makers and researchers. As PSPPD Programme Manager Mastoera Sadan explains, policy-makers need evidence to inform their decisions so they can make better policy choices and improve implementation of those policies.
Policy-making is a highly complex process influenced by many factors, ranging from people’s beliefs, values, knowledge and vested interests to structural, cultural and financial constraints. But with the use of good quality empirical evidence, policy-makers can navigate their way through this often difficult terrain.
“Good quality research can help to uncover the extent of problems and the underlying causes. This is important in deciding where to focus, as well as what interventions are needed to address the root causes,” says Sadan.
Deputy Minister of Rural Development and Land Reform Lechesa Tsenoli agrees: “While evidence can confirm, contradict or amend what is on the table, it must be strong, well-grounded evidence that is presented appropriately.”
How evidence shapes policy
‘Evidence’ refers to the body of knowledge that is being drawn on and used to inform policy decisions. Although evidence from research is the most authoritative and scientific, people use a number of sources of knowledge to make decisions, including their own experience and judgement, informal networks, lobbyists and other purveyors of information.
Of course, the most popular and easily accessible source of evidence is articles from the internet, but the credentials of the author and the validity of the information are often unknown. By contrast, peer-reviewed scientific research, based on rigorous searching of scientific databases, is often the last source of evidence used. This trend needs to be reversed to strengthen the credibility of the evidence used in policy-making.
Evidence can play a crucial role in bringing issues into focus and finding the best policy response, answering questions about what is already known, potential causes, interventions that have been used to address similar problems, and the cost, benefits and effectiveness of solutions. This approach of using scientific research and other evidence to formulate policies is known as evidence-based policy-making (EBPM).
According to Dr Philip Davies, formerly of the UK’s Cabinet Office and current head of UK-based consultancy Oxford Evidentia, “EBPM helps people make well-informed decisions about policies, programmes and projects by putting the best available evidence from research at the heart of policy development and implementation.”
Within the context of a global recession and limited resources, it is even more important to make sure that policy is evidence-based so we know we are using scarce public resources to maximum effect. Also, if a policy is not working as planned, we know what changes need to be made and how. A crucial challenge in the EBPM process is finding relevant, good-quality research in the first place.
As Sadan says, “While governments should depend much more on the systematic use of knowledge, this is easier said than done. Because there are such vast amounts of information available — not all of equal quality, or even useable — we need to not only analyse and manage it, but also interpret it.”
Sifting through the evidence quagmire
The way data is interpreted plays a significant role in the EBPM process because, as Katharine Hall, senior researcher at the Children’s Institute of the University of Cape Town, argues, although statistical indicators are powerful tools, they can be meaningless or misleading if analyses are undertaken in a vacuum. Along these lines, Deputy Minister Tsenoli explains that the element of bias also has to be factored in when gathering evidence.
“We must question whether we are generating information that merely satisfies our own prejudices,” he says. Building on this idea, Professor Mary Metcalfe, former Director General in the Department of Higher Education and current sector specialist at the Development Bank of Southern Africa (DBSA), points out that the assumptions you make about a problem shape the questions you ask.
“It is important to interrogate the assumptions and our understanding of the problem before we even begin to look for evidence. Having the right questions in order to get the best information and deepen the explanations depends on understanding processes of social exclusion and the complex way in which they inter-relate. And we have to understand the changing social dynamics, continually testing assumptions and being open to new questions.”
Getting research into policy
Making research available and accessible to policy-makers is fundamental to the success of EBPM. “Research can be reported in many ways. For a non-academic practitioner audience, it is important to strike a balance between rigour and accessibility, so that reports are easily readable without being simplistic, and objective while still making policy-relevant points. Wide reach is important as well, from working with advocacy groups and harnessing the media to, as a last resort, using research evidence in litigation,” suggests Hall.
Carmine Rustin, chief researcher in Parliament, agrees. “There is a need to translate research from the realm of academia into manageable chunks for the legislative community, showing linkages with policy,” she says, highlighting that policy-makers are faced with enormous time constraints and are forced to process huge amounts of information relatively quickly. Generating good quality research from raw data fast enough to keep up with the demands for it in policy-making processes is a pressing issue; policy-makers need timely answers and do not have time to wade through a profusion of evidence.
Furthermore, as the Overseas Development Institute (ODI) reveals in its Does evidence matter? meeting series, “Each policy-maker has to cover vast thematic fields and cannot possibly have in-depth knowledge about every issue in those areas. They are therefore heavily dependent on the knowledge and integrity of the people who inform them. This raises difficult questions about who policy-makers should turn to for advice and how they can judge the advice given to them.”
“Research has found that where there is timely and relevant evidence and interaction between researchers and policy-makers, there is better use of the evidence. This implies that researchers and policy-makers need to be working together to identify problems and policy options and consider implementation obstacles,” says Dr Taryn Young, Director of the Centre for Evidence-based Healthcare at the Faculty of Health Sciences, Stellenbosch University.
The PSPPD works to bridge the gap that often exists between researchers and policy-makers by promoting the informed supply of evidence by researchers and the effective demand for it by policy-makers.
“There is a need to build the capacity of researchers and people who work in policy to communicate with decision-makers because, while they are very good at generating ideas and discovering new ways of how things work, they often don’t necessarily have the ability to persuade the rest of the world that that is the appropriate route to take,” explains Deputy Minister Tsenoli. “In addition, we need to build the capacity of departments to critically review evidence.”
Building this kind of capacity is another focus area of the programme, together with building capacity for the development of good quality research and for the effective use of the findings. To further enhance this policy-research interface, the PSPPD has funded 13 research projects and a range of other knowledge collation, learning and capacity development activities.
No magic bullet
Ultimately, however, as Davies cautions, evidence doesn’t solve policy problems; it only gives people the tools to come up with solutions. It informs decisions and suggests directions and possibilities for effective interventions. It helps us identify what we know, which in turn helps us identify what we don’t know.
Answering the question posed by the PSPPD event, “is evidence the answer?” Davies tells us the short answer is no, not on its own. “There is no magic bullet for improving government,” he says. “Many factors other than evidence influence policy. But the use of high quality evidence, when combined with those other factors, can be a formidable tool for determining the effectiveness, implementation and impact of policies, and ultimately, for improving people’s quality of life.”
Case study: The child support grant
Probably one of the best case studies of the relationship between evidence and policy is South Africa’s child support grant (CSG), a shining example of the power of cash transfers in breaking the poverty trap and government’s most successful poverty reduction programme. Evaluations of the CSG leave no doubt about the positive effects it has on the lives of South Africa’s poor children, reducing hunger, promoting nutrition, improving school attendance and improving health outcomes.
Since its introduction in 1998, the CSG has continued to grow rapidly in terms of coverage, and its rand value has exceeded inflation. In 2002, only 1.8 million children were benefiting from the grant and it was limited to children under seven years old. Less than a decade later, the grant reaches almost 11 million children under the age of 18 and stands at R270 per month, almost three times its starting value of R100.
Although the amount of the grant is small, research and evaluation studies show the positive impact the CSG has on reducing child poverty. There are various reasons for the expansion and improved delivery of the CSG. At its core is research, which has been taken up by civil society, the public, the media and government as part of their campaign for the extension of the CSG’s coverage.
Research and advocacy around implementation barriers, for example, led to improvements in service delivery and take-up, such as by allowing alternative documentation for children without birth certificates who had previously been excluded. Other initiatives and advocacy work that contributed to the improvements include:
- Expansion campaigns resulted in increases to the age threshold and the means-test threshold;
- Public education on the CSG and how to access it also drove take-up;
- Empirical evidence on targeting and exclusions informed policy discussions and even court cases;
- Budget work enabled cost-analysis of various scenarios.
A strong research agenda and a joint commitment to improving the lives of children have enabled government, researchers and civil society to bring about evidence-based adjustments to the CSG. Although positions may at times verge on being adversarial, this partnership is essential for improving policy and implementation.
Case study: Outcomes-Based Education
The now-defunct Outcomes-Based Education (OBE) programme is a prime example of how research can shape attitudes. From the outset, when it was implemented in 1997, Jonathan Jansen, then at the Macro-Education Policy Unit of the University of Durban-Westville and now Vice-Chancellor of the University of the Free State, threw light on the shortcomings of the approach from a technical perspective in his paper Why OBE will fail. Papers like this sparked debate among education institutions and civil society and were instrumental in building the political impetus behind the changes to Curriculum 2005.
By 2000, the Minister of Education at the time, Kader Asmal, had called for a review of Curriculum 2005, proposing changes as outlined in this research. Further evidence generated by the ministerial review committee found that implementation of the curriculum had been held back by a host of challenges, including excessive use of jargon, a flawed curriculum design, inadequate training and development of teachers, a lack of emphasis on textbooks, inadequate district support and implementation time-frames that were too short. Several internal and external evaluations were also conducted in the period leading up to the ministerial review.
Once findings from the original research filtered into the public domain, many of the issues were picked up and further investigated in subsequent research. Continued research by universities and science councils contributed to the critical mass of evidence reinforcing the message that OBE was not working.
At first the trade unions and the ANC resisted the idea of abandoning OBE, but attitudes shifted over time. Researchers and key people in government and trade unions persisted in demanding evidence about the impact of OBE. The South African Democratic Teachers Union even commissioned a study on the effects of OBE on teachers, which showed that it increased their workload. International evaluations of OBE were examined. In the end, the unions and the ANC were fundamental in catalysing policy reform for the OBE system. And in 2010, the new curriculum, Schooling 2025, was launched.
Evaluating what works and what doesn’t… and why
For government, understanding through monitoring and evaluation how its programmes and services are working can be an important part of the policy-making process, as well as a key source of evidence.
Evaluations can contribute to the improvement of government interventions by providing evidence-based assessments of their relevance and performance. They can also strengthen accountability by providing reliable information on progress towards government’s objectives and by identifying the key factors driving success or failure, which can then be addressed to improve performance. It is about continually learning what works, what doesn’t, and why.
To make evaluation more widespread in government, the Presidency’s Department of Performance Monitoring and Evaluation (DPME), supported by the PSPPD, has developed the National Evaluation Policy Framework, which was adopted by Cabinet on November 24 2011.
The framework’s main purpose is to improve the effectiveness, impact and accountability of government by reflecting on what is working and what is not, and assisting government to revise its programmes and policies accordingly. It seeks to ensure that evidence from evaluation is used in planning, budgeting and ongoing project management.
Reviewing the research
It is clear that research is not enough. Collecting that research, appraising it, analysing it and then synthesising it into accessible formats are essential if it is to be taken up in the policy-making process. EBPM tools such as systematic reviews and rapid evidence assessments (REAs) can be used highly effectively in finding the relevant evidence to help clarify key policy issues and make better policy. These tools also help identify where there are gaps in the evidence base and where new research may be needed.
Systematic reviews can take up to two years to complete and are the Rolls-Royce of evidence synthesis. However, the shorter version, the REA, is also very valuable and links more easily with the pressurised timescales of government. REAs, which were pioneered in the British government, typically take three to six months and, although some rigour has to be sacrificed, they are still likely to pick up much of the available evidence and enable policy-makers to proceed more quickly with the evidence they have found.
The issue of violent crime in South Africa was the subject of a recent REA commissioned by the PSPPD with the objective of answering the question: “Why is crime in South Africa so violent?” This REA, which is available at www.psppd.org.za, examined a number of existing research studies on crime and violence, including the frequently cited Centre for the Study of Violence and Reconciliation (CSVR) report, The Violent Nature of Crime in South Africa.
“The reviewed evidence suggests that the reasons for violence and violent crime in South Africa are a combination of political-historical, environmental and individual factors,” explains the REA. “Poverty, unemployment, inequality and social exclusion also contribute to South Africa’s burden of violence, but are inseparably related to these key factors.”
This document has been produced with the financial assistance of the Programme to Support Pro-Poor Policy Development (PSPPD), a partnership programme of the Presidency, Republic of South Africa, and the European Union. The contents of this report can in no way be taken to reflect the views of the Presidency (RSA) and/or the European Union.
For more information visit http://www.psppd.org.za/
This article originally appeared in the Mail & Guardian newspaper as an advertorial.