Minister wrong to say ‘better results’ show SA education is improving
Matric results are getting better. This is good news for the students concerned. But the minister of education is wrong to say the results show government strategy is ‘improving education quality’.
Researched by Mandy de Waal
When South Africa’s Minister of Basic Education, Angie Motshekga, revealed the matric results for 2012 earlier this month, she had good news for many students and their parents.
Matric results, which have been getting better for several years, are still improving, she said. The national pass rate in 2012 rose to 73.9%, up by more than 13 percentage points from the 60.6% who achieved a pass in 2009 and by more than three percentage points from the 70.2% who did so in 2011.
That is undoubtedly good news for the students concerned. But what does it mean about the overall performance of the education sector?
“Our national strategy for improving literacy and numeracy has assisted in improving education quality,” she stated clearly in her remarks in Johannesburg on 2 January.
To back up her claim, she recalled the results of the Annual National Assessments (ANAs), piloted in 2008 and first published in 2011. These tests cover some 7 million learners, and showed some improvement, particularly in Grade 3 literacy and numeracy, she said.
Do results show SA education is improving?
Measuring the quality of education is notoriously complex, and most experts agree that no single measure on its own is a good indicator of performance. All have blind spots. What about the matric results and the ANA tests?
What the matric results don’t show
It might seem obvious, but a first flaw in using the matric pass rate to assess the performance of South Africa’s public education system is that it only tracks those learners who make it through to take the matric and ignores those – normally about half of their year – who drop out of the school system early.
The matric pass rate is calculated as a percentage of the students enrolled in Grade 12, not of the cohort that started school together, and this makes it a flawed measure of national performance.
As Nicholas Spaull, a researcher at Stellenbosch University who focuses on primary schooling, told Africa Check: “Students are pushed through the system until grade 10, and then schools realise that if they put these kids through, they are not going to pass grade 12.”
“Getting low pass rates in matric is problematic for schools, so they weed out these students,” he said.
The matric pass rate is thus bumped up and gives no indication of how the roughly 50% who leave the system are doing.
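The arithmetic behind this point can be sketched with a few lines of code. The cohort and enrolment figures below are invented for illustration – only the 73.9% pass rate comes from the article:

```python
# Hypothetical illustration: measuring the pass rate against Grade 12
# enrolment, rather than against the cohort that started school, inflates
# the headline figure. All absolute numbers here are invented.

cohort = 1_000_000          # learners who started school together (assumed)
reached_matric = 500_000    # roughly half make it to Grade 12 (per the article)
passed = round(reached_matric * 0.739)  # using the reported 73.9% pass rate

official_rate = passed / reached_matric * 100  # what gets announced
cohort_rate = passed / cohort * 100            # share of the original cohort

print(f"Official matric pass rate: {official_rate:.1f}%")   # 73.9%
print(f"Pass rate of the full cohort: {cohort_rate:.1f}%")  # about 37%
```

On these assumed numbers, the announced 73.9% corresponds to only about 37% of the learners who originally entered the system.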
Schools push pupils into easier subjects
Another way that schools play the system is by pushing their students to take subjects that are easier to pass.
In 2009, about 51% of students took maths and about 49% took the easier subject of maths literacy. In 2012, only 44% of the students who passed took maths, while 56% studied maths literacy.
As Equal Education’s Doron Isaacs told Africa Check, this produces a better matric pass rate but not a better education.
“The pass rate can go up because there is a genuine improvement, or because there is a shift of students out of physical science and into easier subjects like business studies, or tourism, or consumer studies,” he said.
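Isaacs’ point can be illustrated with a small sketch. The per-subject pass rates below are assumed and held constant – i.e. no genuine improvement in teaching – while the subject shares match the shift the article describes:

```python
# Hypothetical sketch: the overall pass rate can rise purely because
# students shift into an easier subject. Per-subject pass rates are
# assumptions, held fixed across both years.

pass_rate = {"maths": 0.50, "maths_literacy": 0.80}  # assumed, unchanged

def overall_pass_rate(share_maths: float) -> float:
    """Weighted pass rate given the share of students taking maths."""
    share_lit = 1 - share_maths
    return (share_maths * pass_rate["maths"]
            + share_lit * pass_rate["maths_literacy"])

rate_2009 = overall_pass_rate(0.51)  # about 51% took maths in 2009
rate_2012 = overall_pass_rate(0.44)  # about 44% took maths in 2012

print(f"2009: {rate_2009:.1%}, 2012: {rate_2012:.1%}")  # 64.7% -> 66.8%
```

With nothing about teaching quality changing, the overall pass rate rises about two percentage points simply because more students sit the easier subject.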
Good performances skew the average
A third problem with using the matric results is that strong performances in the 25% or so of schools that perform well mask the poor performances in the 75% or so judged dysfunctional.
This skews the average upward, so the headline figure does not reflect what the typical student experiences.
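The skew is easy to see with a toy calculation. The 25%/75% split comes from the article; the scores themselves are invented for illustration:

```python
# Hypothetical sketch: a strong minority of schools pulls the mean well
# above what the typical (median) school achieves. Scores are invented;
# only the 25%/75% split is taken from the article.

import statistics

# 25 well-performing schools and 75 dysfunctional ones (illustrative scores)
scores = [80] * 25 + [40] * 75

mean = statistics.mean(scores)      # pulled up by the strong minority
median = statistics.median(scores)  # what the typical school actually scores

print(f"mean={mean}, median={median}")  # mean=50, median=40
```

On these assumed scores the national average of 50 sits well above the 40 that three-quarters of schools actually achieve, which is the sense in which the mean misrepresents most students.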
Are the ANAs a better judge?
So are the Annual National Assessments (ANAs) a better judge of national educational performance? Here again, according to the education experts, there are problems.
First, educationalists such as Spaull have a number of serious methodological concerns about the construction of the tests, as set out in a recent article in the Mail & Guardian.
As Isaacs pointed out, the marking of ANA scripts is not nationally moderated. “It is not done the way the matric marking is done, which is standardised.”
Then there is the issue of the point in the year that the tests are carried out.
As Brahm Fleisch of Wits’ Educational Leadership and Policy Studies points out, the 2011 test was done in February of that year, whereas the 2012 round was done in September.
“There are substantial gains made by kids in that number of months. To make claims about improvements about tests that were made at different times of the year must be done with great caution,” Fleisch told Africa Check.
Conclusion
The improvement in the matric pass rate is good news for those concerned, but Minister Motshekga is on shaky ground claiming that the matric and ANA results show: “Our national strategy for improving literacy and numeracy has assisted in improving education quality.”
Firstly, the matric results cover only about half of those who entered school together, the other half having either dropped out, for a variety of reasons, or been encouraged to leave by schools fretting about their pass rate. Secondly, it is clear that many schools are playing the system, pushing learners into easier subjects in order to improve their matric pass rates.
Encouragingly, a better measure, the internationally backed Trends in International Mathematics and Science Study (TIMSS), does give some grounds for cheer. Although there was no improvement in South Africa’s performance between 1995 and 2002, between 2002 and 2011 there was a relatively large improvement of about one and a half grade levels of achievement in maths and science.
This is positive news, but it still puts South Africa at the bottom of practically all participating countries – including other developing countries.
Edited by Peter Cunliffe-Jones