The newspaper article lamenting the decline in Australian students' PISA results should not come as a surprise to Australian teachers. As a colleague of mine once said: students don't get taller because you measure them more often. Improved test results come from quality teaching and learning, not from testing. How have we reached a situation where Australian governments are investing in more accountability and creating high-stakes tests, yet standards are falling?
Politicians have always wanted to know: are the students doing better today than yesterday? The bureaucrats were never able to answer this question with hard data because their tests were all norm-referenced. What was needed was standards-referenced testing, and the Basic Skills tests in NSW were the first to take up this challenge, using Item Response Theory to provide questions that could be placed on a common difficulty scale and so allow comparison of results over time.
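The idea behind that common difficulty scale is that item difficulty and student ability are expressed in the same units, so results from different test years can be compared against the same standard. A minimal sketch of the simplest such model, the one-parameter (Rasch) model, with illustrative ability and difficulty values rather than real test data:

```python
import math

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch (1PL) model: the probability that a student of the given
    ability answers an item of the given difficulty correctly.
    Both values sit on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Illustrative values only (not NAPLAN or Basic Skills data).
# A student whose ability matches an item's difficulty has a 50% chance.
print(round(p_correct(0.0, 0.0), 2))   # → 0.5
print(round(p_correct(1.5, 0.0), 2))   # easier item for this student: 0.82
print(round(p_correct(0.0, 1.5), 2))   # harder item: 0.18
```

Because every item, whenever it is administered, is calibrated onto this one scale, a cohort's results in one year can be compared with another cohort's results years later.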
As more governments became interested in this method, which could provide comparative data over time as well as diagnostic information for teachers (the latter of little interest to politicians and bureaucrats), more tests appeared until each state was operating some form of basic skills testing. It was only a matter of time before such testing became national. Now we have the NAPLAN tests at Years 3, 5, 7 and 9, and the data, along with other school data, are used to highlight "successful" schools and make comparisons, albeit with like schools.
So what is wrong with this? In one sense, nothing: the tests are well constructed, follow the best standards-referenced testing procedures and provide useful comparative information. What problems might then arise?
Rachel is up early for a Saturday morning and has to have breakfast before the rest of the family. She is off to a non-government school in the inner west of Sydney. Fortunately, she does not have to wake the other members of her family because she can walk to school. She wears her school uniform, of course, for this special three-hour class. At her school there is almost 100% attendance from Years 3, 5, 7 and 9 at sessions to practise and become skilled in the NAPLAN tests. Already her teachers have been focusing on the question types likely to be tested, revising the most expected items. These extra classes, running until the time of the tests, will provide the extra help that makes this school look good to the community it serves.
For Rachel it’s a pain to have to go to school on Saturday and to repeat, over and over, work she already knows. But the school insists.
Does this form of practice distort results? I don’t know; but it would be a worthy project for researchers.
In my previous life in the NSW Department of Education and Communities I had responsibility for the introduction of the first primary Human Society and Its Environment (HSIE) syllabus from 2000. One of the implementation problems was that some principals, caught up in the focus on Basic Skills test results, declared that their school was a literacy and numeracy school and that there would be little curriculum time for HSIE. They had determined their priorities, and these did not include HSIE or Science and Technology.
HSIE consultants faced a long battle with such schools, and also with some district superintendents who were encouraging that approach.
The problem highlighted here is that narrowing the curriculum to meet test expectations affects students' total learning. They know more about fewer things, even to the point of over-learning, but have less general education. Their knowledge of the world is reduced, and as a result they find less relevance in the work they do learn. Then when they sit broader tests like PISA, even in the same subjects, they do not do as well as expected. So governments are getting a mixed message: schools are improving on the NAPLAN tests but falling behind in international tests. The reason is really simple: students’ broad general education is being sacrificed to the test god.
There will be many who will want to scrap the NAPLAN tests and others who will argue that we don’t need our students involved in international testing. Both these solutions would seem unlikely.
One solution might lie in the development of the Australian curriculum and an expansion of the testing program. If testing is too narrow, and it is having the effect of narrowing the curriculum, then let us expand the testing to cover the full curriculum. If we are to have a truly Australian curriculum, then testing in other subjects must be possible using the well-established methodology. Students would then access the entire curriculum, and the sort of practice described above would become far more difficult, and largely irrelevant.
Of course, the cost of administering such tests to everyone would be high, but not everyone needs to sit them. Perhaps the information governments need can come from sampling, not only in the new subjects to be tested but also in literacy and numeracy. In fact, such a procedure would probably save money. How long can the government afford to test every student in Years 3, 5, 7 and 9? There could be a compromise solution in which whole cohorts are tested in particular subjects on some form of schedule, so that not every student is tested in every subject every two years. Because of standards-referenced testing it is not necessary to follow a cohort; students at any point in time are measured against the standard.
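One way such a compromise schedule could work is a simple rotation, where each biennial test round assesses only a couple of subjects and the rotation covers the full curriculum over several rounds. A sketch under illustrative assumptions (the subject list, the two-subjects-per-round choice and the starting year are all hypothetical, not a proposed policy):

```python
# Hypothetical rotation: each test round assesses a pair of subjects
# across Years 3, 5, 7 and 9, cycling so the whole curriculum is
# covered over the schedule without testing everything every time.
SUBJECTS = ["Literacy", "Numeracy", "Science", "HSIE", "Arts", "Technology"]

def schedule(start_year: int, rounds: int, per_round: int = 2):
    """Return (calendar_year, subjects tested) pairs, rotating
    through SUBJECTS so each subject recurs on a regular cycle."""
    plan = []
    for i in range(rounds):
        offset = (i * per_round) % len(SUBJECTS)
        chosen = [SUBJECTS[(offset + j) % len(SUBJECTS)]
                  for j in range(per_round)]
        plan.append((start_year + 2 * i, chosen))  # biennial, as with NAPLAN
    return plan

for year, subjects in schedule(2014, 3):
    print(year, subjects)
```

Three rounds cover all six illustrative subjects; because standards-referenced testing measures any cohort against the fixed standard, it does not matter that different students sit different subjects in different years.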
Providing students with full curriculum coverage is more likely to assist them in international testing, whereas narrowing the curriculum to help students do better in narrowly based NAPLAN tests is likely to perpetuate the dichotomy of improving NAPLAN results and declining international standards.