Fighting for Equity in Education

The struggle is long but hope is longer

Big Increase in Students Withdrawn from NAPLAN Tests

Thursday November 22, 2012

An increasing number of parents are withdrawing their children from the NAPLAN tests. There has been a four- to five-fold increase across Australia since 2008 in the percentage of children withdrawn from the numeracy tests. Withdrawals have increased in all Year levels tested and across all states and territories, with the largest increases in the ACT, Queensland, South Australia and Victoria.

It is not clear whether the increase is due to growing parent concerns about NAPLAN, increasing rorting of school results, or a combination of factors. Certainly, more and more parents are becoming aware that NAPLAN is not compulsory, despite the efforts of education authorities to suggest that the tests are mandatory. Schools are also under tremendous pressure to improve their results, and there is anecdotal evidence of schools encouraging parents of lower achieving students to withdraw them from the tests or keep them home on test days.

Although the percentages of students withdrawn are still small, the rapid growth poses a threat to the reliability of NAPLAN results for inter-school comparisons, inter-jurisdictional comparisons and trends in indicators of student achievement. This threat has been highlighted by the COAG Reform Council.

In the ACT, the percentage of Year 3 students withdrawn from the numeracy tests increased from 0.8% in 2008 to 4% in 2012 [Chart 1]. In South Australia, the percentage withdrawn increased from 0.6% to 3.3%; in Queensland it increased from 0.3% to 2.4% and in Victoria from 0.1% to 2.4%. For Australia, it has increased from 0.5% to 1.9% – a four-fold increase. The smallest increase was in NSW where the percentage withdrawn increased from 0.8% to 1%.

There was also a big increase in Year 9 students withdrawn. The percentage withdrawn from the Year 9 numeracy test across Australia increased from 0.3% to 1.4% – a five-fold increase [Chart 2]. In the ACT, it increased from 0.3% to 2.1%; in South Australia from 0.2% to 2.8%; in Queensland from 0.5% to 2.8%; and in Victoria from 0.1% to 1.3%. The increase in NSW was negligible – from 0.4% to 0.5%.

There are significant differences between the states and territories in the percentage of Year 3 students withdrawn from NAPLAN, ranging from 4% in the ACT to 1% in NSW. The ACT and South Australia (3.3%) had the highest percentages of students withdrawn from the Year 3 numeracy tests in 2012 while NSW, Tasmania (1.3%) and Western Australia (1.3%) had the lowest percentages withdrawn.

State/territory differences at the Year 9 level are smaller than in Year 3. Queensland and South Australia had the highest percentages of Year 9 students withdrawn – 2.8% each. NSW, Tasmania and Western Australia had only a very small percentage withdrawn – 0.5% to 0.7%.

The increase in students withdrawn accounts for a small but significant declining trend in the percentage of students sitting the NAPLAN tests since 2008. The percentage present for the Year 3 numeracy tests across Australia fell from 94.6% in 2008 to 93.1% in 2012, and from 91.8% to 89.8% in Year 9 [Charts 3 & 4].

In contrast to the trend in withdrawals, there has been little change since 2008 in the percentages of students absent on test days or exempt from the tests.

The percentage of students exempt from the tests shows little change. In Year 3 numeracy, it increased from 1.7% to 1.9% for Australia, and in Year 9 from 1.1% to 1.6% [Charts 5 & 6]. The biggest increases were in NSW, where the percentage of exempt students increased from 0.9% to 1.7% in Year 3 and from 0.5% to 1.3% in Year 9.

There was also little change in the percentage of students absent on test day. In Year 3 numeracy, the percentage absent for Australia declined slightly from 3.3% in 2008 to 3.1% in 2012, while in Year 9 it increased slightly from 6.8% to 7.2% [Charts 7 & 8]. Absent students comprise the largest proportion of students not sitting the NAPLAN tests. For example, 7.2% of Year 9 students in Australia were absent from the numeracy test in 2012, while 1.4% were withdrawn and 1.6% were exempt.

In 2012, 9% of Year 9 students in Australia were either withdrawn or absent, with a range from 7% in NSW to 11% in Tasmania and 17% in the NT. Five per cent of Year 3 students in Australia were either withdrawn or absent, ranging from 3% in NSW to 7% in the ACT and South Australia and 14% in the NT.

There is concern in official circles about decreasing participation in NAPLAN. Last year, the Ministerial Council for Education, Early Childhood Development and Youth Affairs commissioned work on participation rates by a strategic policy working group with the Australian Curriculum, Assessment and Reporting Authority (ACARA). The report of this group is due to be completed by the end of 2012.

The reason officials are becoming concerned about the trend is that increasing numbers of students being withdrawn or absent from NAPLAN will affect the reliability of the results (exempt students are included in the NAPLAN results by being deemed to be below minimum national standards). Changes in participation rates could affect the results of individual schools, sub-groups of students such as Indigenous and low socio-economic status students and state/territory results as well as trends over time.

In its report on education performance in 2010, the COAG Reform Council emphasised the importance of high participation in NAPLAN for the reliability of results. It said:

In order to accurately report literacy and numeracy achievement, it is important that as many students as possible sit the NAPLAN tests. Small differences in participation may affect literacy and numeracy achievement because the ability of students who do not participate is likely to differ from students who do. [p. 22]

The impact on individual school results will depend on the background of the students who are withdrawn or absent. The withdrawal of lower achieving students will increase a school’s average result. As more students are withdrawn over time a school’s results will be artificially boosted.
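This effect is simple arithmetic, and can be illustrated with a small sketch (the scores below are made up for illustration, not actual NAPLAN data):

```python
# Illustrative only: made-up scores for a hypothetical ten-student cohort,
# not actual NAPLAN data.
scores = [420, 450, 470, 480, 500, 510, 530, 550, 580, 610]

def mean(xs):
    return sum(xs) / len(xs)

# Mean if every student sits the test.
full_cohort_mean = mean(scores)

# Suppose the two lowest-scoring students are withdrawn on test day;
# the published school mean is computed only over those who sat.
reduced_mean = mean(sorted(scores)[2:])

print(full_cohort_mean)  # 510.0
print(reduced_mean)      # 528.75
```

Dropping just two low scorers from a ten-student cohort lifts the published mean by nearly 19 points, even though no student's actual achievement has changed.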

Research recently published by the COAG Reform Council shows that non-participants in NAPLAN tend to be lower scoring students. ACARA uses statistical imputation techniques to estimate test scores for absent and withdrawn students to reduce bias in state-wide comparisons of results. The COAG Reform Council research drew on this data to show that students who sat for NAPLAN have higher mean test scores than withdrawn and absent students. In NSW, the difference between Present and Absent student means in numeracy in 2011 was 20 points in Year 3 and 39 points in Year 9. In Victoria, the differences were 13 points in Year 3 and 24 points in Year 9. The Year 9 differences are significant, being equivalent to nearly two years of learning in NSW and one year in Victoria. The differences between Present and Withdrawn students were much smaller – negligible in Year 3 and 15 points in Year 9 in both NSW and Victoria; the latter difference is equivalent to about six months of learning.

As participation declines, the reliability of the average scores of individual schools will also decrease, particularly in small schools, where the statistical uncertainty (or error band) around the average score is relatively large because of the small number of students sitting the tests. This increases the unreliability of school rankings and league tables as a guide to school quality.
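The error-band point can be made concrete with the standard error of the mean, which shrinks only with the square root of the number of test-takers. The cohort sizes and score spread below are illustrative assumptions, not NAPLAN parameters:

```python
import math

# Illustrative assumption: individual scores spread with a standard
# deviation of about 70 NAPLAN-style scale points.
SD = 70

def standard_error(n):
    """Standard error of a school's mean score when n students sit the test."""
    return SD / math.sqrt(n)

for n in (15, 60, 240):
    print(n, round(standard_error(n), 1))

# Under these assumptions, a 15-student cohort's mean carries a standard
# error of about 18 points, four times that of a 240-student cohort
# (about 4.5 points), so small-school means and rankings are far noisier.
```

Halving the number of test-takers inflates the error band by a factor of about 1.4, which is why falling participation hits small schools' published results hardest.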

There are also implications for state-wide comparisons and trends. Lower participation rates mean that more and more test scores are imputed by ACARA and this is not as accurate as having students participate in the tests. If a state or territory has a low level of participation then more scores in that jurisdiction will be imputed and this could bias inter-jurisdictional comparisons and trends.

The research commissioned by the COAG Reform Council recommended that further work be done on examining the impact of non-participation in NAPLAN on trends in achievement and the comparability of results across jurisdictions. It found that variability in participation rates may have “a substantially important impact” on jurisdictional comparisons of NAPLAN results and that its impact on achievement trends is “potentially of concern” [p. 15]. It also said that examination of the method of imputing test scores is warranted because it is statistically complex and the impact that it may or may not have on the indicators is not immediately clear.

To summarise, the data on NAPLAN participation rates clearly show that parents are increasingly exercising their right to withdraw children from the tests, and that a significant proportion of students are absent on test day, especially secondary students. While the absolute percentages of students withdrawn or absent are small, they are likely to continue to increase as more and more parents become aware of their rights. Declining participation will affect the reliability of published school results, inter-school comparisons and league tables; trends in national and state/territory achievement; and inter-jurisdictional comparisons of results. There is evidence of growing official concern about the reliability of NAPLAN results.

Trevor Cobbold

Charts on the Withdrawal of Students from NAPLAN Tests.pdf

Big Increase in Students Withdrawn from NAPLAN Tests.pdf
