What's Behind Australia's Tottering PISA Results?
Thursday February 16, 2017
This is a summary of a discussion paper published by Save Our Schools. The paper can be downloaded below. Comments are invited.
The main features of the PISA 2015 results for Australia are:
• Continuing declines in mean scores in reading, mathematics and science across all states, ability levels, school sectors and demographic groups, particularly amongst provincial students;
• High proportions of low SES, Indigenous, provincial and remote area students are not achieving the international reading, mathematics and science standards;
• Significant increases in the proportion of students below the international standards in all states except Victoria, in all school sectors and amongst most demographic groups, particularly in mathematics;
• Continuing declines in the proportion of students at the most advanced reading, mathematics and science levels across all states and school sectors, and in mathematics and science amongst most demographic groups;
• Continuing large achievement gaps in reading, mathematics and science between high SES students and low SES, Indigenous and remote area students of three to four years of learning and the persistent large achievement gap between high and low SES schools of over three years of learning;
• Continuing very large achievement gaps between the highest and lowest performing students of ten or more years of learning.
1. Mean scores
Average reading, mathematics and science scores for 15 year-old students in Australia have declined significantly over several years. The decline in reading is equivalent to nearly one year of learning while the decline in mathematics is equivalent to a year of learning, and the decline in science is equivalent to about half a year. The declines were amongst the largest in the OECD.
There were significant declines in results for all subjects and across all states except Victoria. There were large declines of over a year of learning in reading in NSW, Western Australia, South Australia, Tasmania and the ACT and in mathematics in Western Australia, South Australia and the ACT. New South Wales, Western Australia, South Australia, Tasmania and the ACT all had declines in science of over half a year of learning.
The timing of the declines differed between subjects. Over half of the decline in reading occurred between 2000 and 2006 with the rest between 2009 and 2015. Mathematics has continually declined since 2003, while the decline in science has occurred since 2009.
There were significant declines across all ability levels, with slightly larger declines amongst the lowest performing students. The extent and timing of decline differed between the highest and lowest performing students. All the decline in reading amongst the highest performing students (90th & 95th percentiles) occurred between 2000 and 2006 and scores have changed little since then. In contrast, about 75 per cent of the decline amongst the lowest performing students (5th & 10th percentiles) occurred in 2015 when scores declined by 23 and 21 points respectively. There was only a small decline in mean scores for low performing students between 2000 and 2012.
The decline in mathematics amongst high performing students was concentrated in two tests – 2006 and 2015 – with over half the overall decline occurring in 2015. The decline amongst low performing students has continued since 2006. The declines amongst students at the 5th & 10th percentiles were 36 and 35 points since 2006, much larger than the declines of 20 and 19 points for students at the 90th & 95th percentiles.
Low performing students had slightly larger declines in science than high performing students. The declines for both groups have occurred since 2009.
Mean scores have declined in all subjects in public, Catholic and Independent schools since 2009, when results were first published by school sector. The declines were similar across all subjects in public and Catholic schools, but slightly smaller in Independent schools. There were some differences between sectors in the timing of the declines in reading and science, but not in mathematics.
Most demographic groups experienced large declines in results. Provincial students had the largest declines in reading, mathematics and science, with declines in reading and mathematics of over a year of learning and a decline in science of about one year of learning. Encouragingly, the decline in reading and mathematics for Indigenous students was about half the Australian average and the decline in science was much lower than the Australian average, although they remain well behind in all fields. The decline in reading and science amongst remote area students was also very low compared to the Australian average. The declines for first generation immigrant students were less than the average for Australia. High and low SES students had similar declines in mathematics and science, while high SES students had a larger decline in reading.
The timing of the decline in reading varied between demographic groups. The large part of the decline for high SES and Indigenous students occurred between 2000 and 2006 and results have been relatively stable since then. After a decline between 2000 and 2006, low SES results were stable until a further fall in 2015. Reading results for provincial and remote area students declined significantly from 2000 or 2003, although results for remote area students increased in 2015.
The pattern of decline in mathematics and science was broadly similar for high SES, low SES, and provincial students. Mathematics results for remote area students declined significantly to 2012, but increased in 2015 and there was little change in science results. Indigenous results in mathematics and science have declined since 2009, although there was a small increase in mathematics in 2015.
2. Students below minimum standards
In 2015, nearly one-fifth of Australian 15 year-olds did not achieve the international minimum standard in reading and science and over one-fifth did not achieve the mathematics standard. There was a sharp increase in the proportion below the reading and science standards in 2015 from 14 to 18 per cent while the proportion below the mathematics standard increased from 13 per cent in 2006 to 22 per cent in 2015.
NSW, South Australia, Tasmania and the ACT had the largest increases in students below the reading standard with increases of eight to nine percentage points while NSW and Tasmania had similar increases below the science standard. Western Australia, South Australia and Tasmania had large increases in students below the mathematics standard of 10 to 14 percentage points. Victoria had the smallest increases of any jurisdiction, with an increase of two percentage points in reading and mathematics and no change in science.
Very large proportions of disadvantaged students were below the international minimum standards in 2015. About one-third of low SES students were below the reading, mathematics and science standards, and slightly fewer remote area and LBOTE students were below the standards. Almost half of all Indigenous students were below the mathematics standard while 40 per cent or more were below the reading and science standards. One-quarter or more of provincial students and one-fifth to one-quarter of students in the 2nd lowest SES quartile were below the standards. In contrast, only seven to nine per cent of high SES students were below the standards.
The percentage of students below the minimum standards increased for all groups from 2006 to 2015. Low SES, provincial and LBOTE students had the largest increases across all subjects with increases of six to 13 percentage points. High SES students had the smallest increases with increases of two to four percentage points.
3. Students at advanced levels
There has been a declining proportion of students at the most advanced levels since the PISA assessments began. The fall in the proportion at the highest reading levels occurred between 2000 and 2006. There were sharp falls in the proportion at the highest mathematics levels in 2006 and in 2015 and a similar fall in science in 2015.
The ACT, NSW and Western Australia had the highest proportions at the most advanced levels. There were large declines in the proportion of students at the most advanced reading, mathematics and science levels in Western Australia and the ACT and a large decline in the proportion at the most advanced mathematics level in South Australia.
Very small proportions of low SES, Indigenous and remote area students were at the most advanced levels in reading, mathematics and science. There was little change in the reading proportion since 2006, but significant declines in mathematics amongst higher SES, provincial, foreign born and Language Background Other Than English (LBOTE) students. Provincial and foreign born students had the largest declines in the proportion at the most advanced science levels.
4. Achievement gaps
The achievement gap between high and low SES students is equivalent to about three years of learning and the gap between high SES and Indigenous is about four years of learning. The high SES/remote area student gap is about three years of learning; the high SES/LBOTE gap is about two years; the high SES/foreign-born immigrant gap is about 18 months and the high SES/first generation immigrant gap is a bit over a year. The gap in mean science scores between high and low SES schools is over three years of learning.
The achievement gaps have largely decreased since the PISA assessments began, with significant reductions in the gaps between high SES and Indigenous students in reading and mathematics and a large reduction in the reading gap between high SES and remote area students. However, the reductions are a result of the declines in scores for high SES students rather than improvements in scores for disadvantaged students. The gaps between high (95th percentile) and low (5th percentile) performing students have increased significantly since 2006.
5. Factors influencing declining results
It is difficult to draw firm conclusions about the factors and causes behind the decline in results. The widespread incidence of the decline across states, school sectors, ability groups and various demographic groups, together with the variation in the extent and timing of the declines, suggests that several factors have contributed and that they impacted differentially. However, there appears to be no clear-cut explanation for the decline.
Some quarters have been quick to blame teachers. However, there is no robust evidence that the quality of teaching has declined over the past 15 years. Australian teachers remain highly qualified by international standards and utilise effective teaching practices for the most part, although there is room for improvement.
Admission standards to pre-service teacher training have fallen and the proportion of entrants achieving an ATAR score of 60 or below has increased significantly. It may be that the quality of undergraduate teaching courses has declined along with entry standards as there has been widespread concern about the variable quality of pre-service training over many years.
Despite many reports on teacher training over the past 15 years, there is no clear evidence that the quality of teacher training courses has declined so significantly as to be a key factor contributing to declining student performance. However, it is difficult to assess the quality of teacher training in Australia because reliable and representative data about current practices and outcomes in teacher education is not available.
Student disruption in Australian classrooms is high by OECD standards across advantaged, disadvantaged, rural, town and city schools and it appears to have worsened over recent years. It could be a factor in the declining results. An OECD statistical analysis finds it has a moderately significant impact on student results. However, student behaviour is a complex issue and the extent of poor behaviour may also be a manifestation of low levels of achievement.
The most likely teaching-related factor affecting student performance is the high proportion of teachers who, because of shortages of qualified staff, are teaching out-of-field across all states, school sectors, socio-economic status groups and school locations. The shortage of mathematics and science teachers is very high by international standards, with about one-quarter to one-third of schools having difficulty in recruiting suitably qualified teachers. As a result, a large proportion of teachers are teaching outside their field of expertise. In 2013, 26 per cent of teachers in Years 7-10 were teaching subjects in which they had not specialised and 37 per cent of early career teachers in Years 7-10 were teaching out-of-field. This should be a major concern, irrespective of the trend over recent years. It suggests considerable scope to improve student performance.
There was an increase in the shortage of science teachers over the past decade and this may have contributed to the decline in science results, but the shortage of mathematics and English teachers has decreased a little over the period. It is therefore difficult to attribute the decline in reading and mathematics to increasing shortages of qualified teachers.
Student absenteeism is a significant factor in low student achievement. A very high percentage of Australian students appear to skip school and arrive late for school. In 2015, 29 per cent of Australian students skipped a day of school in the two weeks prior to the PISA assessments compared to two per cent in Japan and Korea, three per cent in Taiwan, 14 per cent in Singapore, 18 per cent in Canada, and the average of 20 per cent across OECD countries. The percentage skipping school in Australia was the 7th highest out of 35 OECD countries.
Student absenteeism is likely to be a key factor in Australia’s poor performance, but there is insufficient data available to assess its effect on student performance over time. It is an issue that should be investigated more closely because there are many reasons why students skip school.
Another factor may be teenage student attitudes to tests whose results have no consequences for their school and later careers. There is anecdotal evidence that students are increasingly adopting a blasé attitude to standardised tests, but there is no systematic survey evidence over time.
6. Factors influencing low performance and high inequity
While it is difficult to determine the factors that have contributed to the decline in Australia’s PISA results, a stronger association is apparent between several educational factors and the continuing low achievement amongst disadvantaged students and the persistent large achievement gaps between disadvantaged and advantaged students and schools.
Statistical analysis of Australia’s performance by the OECD shows that student and school socio-economic background have by far the biggest effects on school results and the effect is much larger than the average for the OECD. Increasing poverty in Australia is likely to be contributing to continuing poor performance by low SES, Indigenous and remote area students.
The PISA 2015 data shows that the extent and quality of educational resources devoted to disadvantaged students and schools is considerably less than those available to advantaged students and schools. This is certainly the case with teaching resources. For example, low SES and rural schools have fewer teachers with a university degree and a major in science than high SES schools. Teacher shortages are much larger in low SES, rural and town schools than in high SES schools. The gaps are the largest in the OECD.
Effective teaching strategies also seem to be used less in low SES schools than in high SES schools and students in disadvantaged schools have less access to a rigorous mathematics and science curriculum. Low SES schools also have greater shortages in educational materials than high SES schools.
Student absenteeism is very high amongst low SES and other disadvantaged students. In 2015, 34 per cent of students in low SES schools skipped a day of school at least once in the two weeks prior to the PISA test. This was the 6th highest rate in the OECD. Also, around 30 per cent of students in rural and town schools skipped a day. High percentages of these students also arrive late for school. Some 47 per cent of students in low SES schools arrived late for school in the two weeks preceding the tests and around 40 per cent of students in rural and town schools arrived late. Undoubtedly, absenteeism is a factor in the lower average achievement of these students.
Increasing funding for disadvantaged schools is critical to overcoming the shortages of qualified teachers, less use of effective teaching strategies, reduced access to key areas of curriculum, shortages of educational materials and high rates of student absenteeism. Many academic studies show that increased funding for disadvantaged students and schools improves school results. Successive PISA reports, including the latest, have concluded that student performance is higher in education systems that devote more resources to disadvantaged schools than advantaged schools.
Australia’s PISA results have fallen significantly over the past 15 years, but it remains one of the high performing countries in reading and science. However, its mathematics results have slipped to about the average for the OECD.
While the PISA results for 15 year-old students have declined significantly, Year 12 retention rates, completion rates and the proportion of students achieving an ATAR score of 50 or more have all improved significantly over the past 10-15 years. As a result, more 20-34 year-olds have achieved an upper secondary education than ever before.
The sharp contrast between the trends in results for students only two year levels apart presents a conundrum. The PISA results may partly reflect the difference in student attitudes between the PISA tests, which have no personal stakes attached to them, and the Year 12 assessments, which have a major influence on the future paths that students take after leaving school. The contrast in results only two year levels apart does caution against using the PISA results as the sole benchmark to assess student performance.
Apart from possible differences in students’ attitudes to the PISA tests, shortages of teachers in mathematics and science, the large proportion of teachers teaching out-of-field, student absenteeism, and the variable quality of undergraduate teacher education are possibly the main factors contributing to declining results. However, further investigation is needed.
Government and other claims that funding increases haven't delivered better student performance ignore two facts: the increase in total government funding per student, adjusted for inflation, over the past 15 years was very small, and the largest part of it was misdirected to the more advantaged private school sector rather than to public schools, which enrol the vast majority of disadvantaged students. The latest PISA results for disadvantaged students re-affirm the need for the full Gonski funding plan to be implemented.
The long-term decline in Australia’s PISA results, the high proportion of disadvantaged students not achieving international minimum standards and the continuing large achievement gaps between advantaged and disadvantaged students also demand serious review. A full independent public inquiry into Australia’s school outcomes and the high inequity in those outcomes should be established by the national Education Ministers’ Council.