NAPLAN causing anxiety?
Fairfax papers are today reporting new research from the University of Melbourne that suggests students are showing signs of severe anxiety both before and during NAPLAN tests.
Is this sort of student behaviour familiar to you? Do you think this is an accurate snapshot? Should some level of anxiety prior to testing be expected?
NAPLAN: is the pain worth it? smh.com.au
CHILDREN are suffering stress-related vomiting and sleeplessness as some teachers drill them for months prior to the National Assessment Program - Literacy and Numeracy (NAPLAN), according to the first national study into the...
In regard to "But student results are useful for their former primary schools' assessment too" - well, no actually, since primary schools do not have access to the results of their former students once they are in year 7 at other schools. It would be a different story if year 7 NAPLAN results were made available by the NAPLAN Data Service to the previous schools of the students concerned ... but they are not. So, the primary schools have no way of knowing how effective they have been in value adding in the time between the year 5 testing and the year 7 testing.
In regard to "the entire results are not used" - agreed. At the micro-detail level, the results on how students in a school or a particular class have performed on particular questions or question types can be very useful. One of the biggest issues cited by teachers at the coalface who want to use such data to meaningfully adjust their teaching programs is that results are not available for almost six months after the testing - which leaves limited time to make sense of the data, adjust teaching plans and schedules accordingly, and work with these students (at least within the same calendar year). The many teachers I work with overwhelmingly agree that the plan to have the testing done online, and the possibility of immediate feedback to schools on student performance that it will create, will be a very welcome step forward.
In the meantime, as I stated in my original comment, the bald and misguided use of snapshot data to create league tables of the type we have seen thus far merely distracts from the real issues. These are, firstly, to ascertain how effective individual schools are in value adding to their students' understandings and skills over time and, secondly, to provide schools with the detailed data they need to adjust their teaching programs in a 'continuous improvement' process (so they can, in effect, further enhance the value adding they provide).
In regard to "staffing issues the tests have highlighted": at the risk of sounding like a broken record, the only staffing issues that could be highlighted with any validity would be those predicated on the LONGITUDINAL data for a particular cohort of students - data pointing to the degree of value adding that has happened in the school concerned in the two years between tests (e.g. between when the students were tested in year 7 and when the same students were tested again in year 9).
It's not rocket science: you test kids to see where they are at, you teach them, then you test them again to see what progress/incremental change has occurred. It is not the absolute score that tells us anything about the efficacy of the teaching; it's the size of the increment/change.
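To make that arithmetic concrete, here is a minimal sketch in Python. The student names and scores are invented purely for illustration, and this toy calculation ignores things a real value-added analysis would need to handle (regression to the mean, student background, measurement error, students changing schools between tests).

```python
# Hypothetical NAPLAN scale scores for the same cohort, two years apart.
# All names and numbers are made up for illustration only.
year_7_scores = {"student_a": 520, "student_b": 480, "student_c": 610}
year_9_scores = {"student_a": 575, "student_b": 560, "student_c": 640}

# Per-student growth: the increment, not the absolute score, is what
# speaks to the efficacy of the teaching in the intervening two years.
growth = {s: year_9_scores[s] - year_7_scores[s] for s in year_7_scores}

# A simple cohort-level summary: mean growth across the matched students.
mean_growth = sum(growth.values()) / len(growth)

print(growth)       # {'student_a': 55, 'student_b': 80, 'student_c': 30}
print(mean_growth)  # 55.0
```

Note that student_c has the highest absolute score but the smallest increment - exactly the distinction the longitudinal view is meant to capture, and exactly what a snapshot league table obscures.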