By now everyone will know of the erratic launch of My School (www.myschool.edu.au), the Rudd Government's effort to release standardised, test-based reports for every school in Australia. The launch, crash and subsequent political (rather than aeronautical) spin was fun to watch and made good media - but it was a sideshow that distracted from so many of the real issues.
Those able to successfully log on found a site with authoritative statements about the accuracy and fairness of the school data being used. Armed with this assurance they could then cast their eye over the figures and little coloured boxes to find out how any school compares with similar others: green for substantially above average in reading, writing, spelling, grammar and numeracy; red for substantially below; and various shades in between.
The website, developed by the Australian Curriculum, Assessment and Reporting Authority (ACARA), doesn’t directly compare schools that are not similar - but it helpfully provides the names of 20 local schools so that you can do it yourself. But why bother? The league tables weren’t far behind and will be around for years to come.
The people at ACARA know very well that their data shouldn't be used in this way and say on their website that any resulting simple comparisons of schools will be misleading. Almost in chorus the various education ministers have also denounced "simplistic league tables" - while still ensuring that anyone can easily construct them.
The debate about school league tables has raged backwards and forwards for some time: whether they distort the curriculum, encourage cramming and even cheating; whether they actually improve schools and school systems; and whether student test scores can be used to say anything useful about schools at all.
But what about the actual school reports on the My School website?
My School lists up to 60 statistically similar schools against which any school can be compared. The use of "statistically similar" and "like school" conveys the assurance that any comparisons created from the data are going to be fair and reasonable.
Leaving aside all the statistical juggling, ACARA considers 16 factors that apparently have a strong correlation with student performance. All schools are given an index number and are then grouped according to their level of relative socio-educational advantage - which is itself a league table - and this is supposed to enable meaningful and fair comparisons.
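To see why grouping by an advantage index is itself a ranking, consider a minimal sketch of how such a composite index might be built. The factor names, weights and numbers below are invented for illustration only - they are not ACARA's actual ICSEA formula:

```python
# Hypothetical sketch of a composite advantage index, ICSEA-style:
# a weighted sum of standardised school-level factors, scaled so the
# average school sits near 1000. Weights and factors are invented.

def composite_index(factors, weights):
    """Weighted sum of standardised factor scores, centred at 1000."""
    raw = sum(weights[name] * value for name, value in factors.items())
    return 1000 + 100 * raw  # centre at 1000, spread by 100 per unit

# Illustrative schools with standardised factor scores (z-scores).
schools = {
    "School A": {"parent_education": 0.8, "parent_occupation": 0.5, "remoteness": -0.1},
    "School B": {"parent_education": -0.6, "parent_occupation": -0.4, "remoteness": -1.0},
    "School C": {"parent_education": 0.7, "parent_occupation": 0.6, "remoteness": 0.2},
}
weights = {"parent_education": 0.5, "parent_occupation": 0.4, "remoteness": 0.1}

indexed = {name: composite_index(f, weights) for name, f in schools.items()}

# Grouping schools into bands of "similar advantage" requires sorting
# them by index - which orders every school from least to most
# advantaged, i.e. a league table of advantage.
ranked = sorted(indexed.items(), key=lambda kv: kv[1])
for name, idx in ranked:
    print(f"{name}: {idx:.0f}")
```

Whatever the real weights are, the point stands: assigning every school a single index number and banding schools by it necessarily ranks them.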
To create this Index of Community Socio-Educational Advantage (ICSEA) they use the socio-educational characteristics, not of each student’s family, but of their census collection district. The funding formula for private schools, hardly held up as a model of efficacy, is constructed in a similar way.
So in trying to account for the differences between schools ACARA is already off to a shaky start. But it needs to get everything right: when you get into the business of comparing schools, with all that this entails, there can be little margin for error - too much is at stake.
The biggest problem is that 70 per cent of the differences between schools are due to which students each school enrols, not what they actually do as schools. No less than Professor Barry McGaw, the head of ACARA, has repeatedly stated this.
This means that any comparative index used by ACARA must take account of everything that kids bring with them to school that impacts on their test scores. This includes such things as prior learning (from schools and home), their family income (not the average for their suburb), gender (yes, boys and girls are different), how recently they arrived in Australia and specifically where they came from.