Pity the poor parent
Disclaimer: This article is the opinion of the author and does not necessarily reflect that of all Teachers' Christian Fellowship members.
Recent media coverage has focused on criticisms of the NAPLAN tests. These criticisms include:
- too much teaching to the tests
- too much pressure on the students
- the tests only provide information on a narrow range of learning
- the curriculum is being distorted by the focus on literacy and numeracy
- the results over time show little if any improvement for a huge cost
- teachers are rejecting the results as indicators of future teaching
- different conditions (handwritten and computer input) distort results
to name but a few.
Well, if these are the criticisms, why do we still have these tests? The answer is that they provide the only Australia-wide indicators of student performance over time, and the politicians and educational bureaucrats want to know: are students doing better today than yesterday? Without NAPLAN, answers are anecdotal, and that’s not good enough for these stakeholders.
The problem is that any significant change in student performance over time for such a huge cohort can only be marginal. Before NAPLAN, the NSW Basic Skills tests demonstrated just how hard it was to get significant improvement in average scores with a large cohort. Some schools improved, but others went backwards. Depending on the abilities of different year cohorts, results in the same school varied from year to year, but good teaching always had an impact. Smart schools used the information to diagnose areas of teaching weakness and worked on these matters with students to raise performance.
Contributing to this lack of improvement is the construct of the tests themselves. They are not standardised tests in the sense that term is used for IQ tests or the United States grade point average and its associated tests. NAPLAN tests are standards-referenced: questions are trialled and placed on an achievement scale according to how hard students found them. The standards are therefore grounded in students’ own work, and these standards are applied to the construction of the tests and the interpretation of the resulting performance. Students should be able to display improvement, but there is a normative factor in the way the tests are constructed that pulls results towards the long-term average (the norm). Getting year-by-year improvement is difficult, and a frustration for politicians and bureaucrats. But to do it differently would invite all kinds of manipulation and produce results that would not be comparable. So these stakeholders are stuck with the current system.
Politicians and educational bureaucrats are not the only stakeholders. Parents ought to be interested in school results and, in particular, the value-added data. This data shows how a cohort of students has progressed from Year 3 to Year 5 and from Year 7 to Year 9. It compares the school’s position in Years 3 and 7 with its position in Years 5 and 9 against the national average, against schools with a similar starting score and against schools with a similar mix of students. From this, parents can tell how much value the school has added to the cohort, and how that added value compares with the national average and with other schools — at the very least, how much their school is improving student learning relative to comparable schools.
In city areas and large country towns, parents have some flexibility in choosing a school or moving their children to a different school. This data provides some, but obviously not all, of the information a parent might want in choosing a school.
Much of the push to scrap NAPLAN has come from within the education community: from teachers, principals, tertiary educators and the bureaucracy. While many parents don’t like the tests, removing them would fuel the division between public and private schools as claims and counter-claims are made; it would leave parents in the dark about school comparability and allow poor performance to be hidden. As Christians we should be concerned about these matters. Accountability is important (just ask the banks), and while NAPLAN tests are neither comprehensive nor conclusive, they are important indicators for all stakeholders.
As outlined in "Going backwards: 20+ years of a literacy and numeracy focus", the tests may be only a small part of the reason why Australian results on international tests are getting comparatively worse. There are other causes and the tests just show up the scale of the problem.
They are a messenger; don’t shoot them.