League tables and game-playing are encouraging pupils to choose 'easier' subjects at A level, argues Carol Taylor Fitz-Gibbon
Minister for educational standards David Miliband has claimed that there is no evidence that some A levels are more difficult than others. He has not done his homework.
Research carried out at Durham University in the 1990s clearly demonstrated that maths, sciences and foreign languages yielded lower grades than other subjects when like was compared with like. The findings were checked for Sir Ron Dearing's review of qualifications for 16-19 year olds and none was rejected. More able students chose mathematics, sciences and foreign languages, and many did well, but on average they obtained lower grades than would have been the case if they had taken other subjects. For example, 70 per cent of those doing A-level sociology were below average at GCSE, whereas 70 per cent of those doing A-level physics were above average at GCSE.
The situation is unlikely to be different when we complete the 2003 analyses of A levels. The findings bring into question the fairness of the system for students, schools, universities and the country as a whole.
University admissions tutors have long been aware of the differences between subjects and would, for example, generally accept students into physics with lower A-level grades than would be acceptable in English.
With the publication of league tables based on raw data, however, schools now stand to gain from encouraging students to take A levels in which, on average, they will obtain higher grades. Equally, students themselves may be looking at their chances in different subjects. Whatever the mechanism, the result is that growing numbers are taking easier subjects and fewer are taking mathematics, sciences and foreign languages.
If results have to be published in league tables, the only remotely fair comparisons that should be made among schools are of progress ("value added"), not raw examination results. Differences between raw examination results depend heavily on the intake to the school.
Indeed, differences between schools drop by about half when value added is used rather than raw results. In addition, data should be published for a three-year period, as happens in Scotland, to show the inherent variation from year to year.
Even worse than not publishing value-added measures is publishing wrongly calculated value-added scores. Calculations by the Department for Education and Skills assume that all subjects are equally difficult. This particularly benefits schools in which students take many easy subjects.
To right this wrong, we wrote to those schools that were wrongly told their students had made less than average progress, and we allowed schools to publicise the correct analyses.
It is unlikely that limiting the number of subjects "counted" will put this right. The statistical model must allow for the severity of grading to differ between subjects.
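To make the point concrete, here is a minimal sketch of how such an adjustment can work. The figures, the points scale and the simple prediction line are invented for illustration only; they are not the centre's actual statistical model.

```python
# Illustrative value-added calculation with a subject-severity adjustment.
# Step 1: predict each student's A-level points from prior attainment (GCSE mean).
# Step 2: the residual (achieved minus predicted) is the raw "value added".
# Step 3: subtract each subject's average residual, so that severely graded
#         subjects (e.g. physics, maths) are not unfairly penalised.

from statistics import mean

# (gcse_mean, subject, a_level_points) -- invented numbers for illustration
# Points scale assumed here: A=10, B=8, C=6, D=4, E=2
students = [
    (6.8, "physics",   8),
    (6.2, "physics",   6),
    (5.1, "sociology", 8),
    (4.9, "sociology", 6),
    (6.5, "maths",     6),
]

# A crude prediction line; slope and intercept are assumptions for the sketch
def predicted_points(gcse_mean):
    return 2.0 * gcse_mean - 4.0

# Raw value added: achieved minus predicted
raw_va = [(subj, points - predicted_points(g)) for g, subj, points in students]

# The average residual per subject approximates its relative severity of grading
subjects = {subj for subj, _ in raw_va}
severity = {s: mean(v for subj, v in raw_va if subj == s) for s in subjects}

# Subject-adjusted value added: progress is judged against others taking the
# same subject, not against an "all subjects are equally difficult" baseline
adjusted_va = [(subj, v - severity[subj]) for subj, v in raw_va]

print(severity)     # more negative values indicate more severely graded subjects
print(adjusted_va)
```

The point of the third step is simply that a school whose students take severely graded subjects should not appear to be adding less value than one whose students take leniently graded ones.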
It will be difficult for the DfES and Mr Miliband to continue to pretend there is no variation in the severity of grading between subjects. The government would, naturally, like to claim that higher grades reflect its efforts to drive up standards. However, it is clear that only in primary school numeracy has achievement improved, as seen in benchmark baseline tests taken on entry to secondary schools.
That standards have not risen but fallen at A level is particularly clear in mathematics. A 2000 report by the Engineering Council, drawing together evidence from 60 universities, many of which had been giving the same mathematics test to incoming students for over a decade, found "a serious decline in students' mastery of basic mathematical skills and level of preparation for mathematics-based degree courses".
A word about the growing popularity of psychology, about which some regrets have been expressed. Psychology courses may vary in content, but the best include required courses in quantitative research methods and in reliable, valid measurement.
Indeed, were a good psychologist leading the Qualifications and Curriculum Authority, he or she would ensure that:
* Papers from one school were not all marked by one marker
* Bias was made impossible by having the examination papers marked "blind"
* The inevitable uncertainty in awarding grades was measured and reported
* The markers were not known to the schools.
I hope that many of the new generation of psychologists go into politics.
We might then have people who can set up sound systems and help the minister to do his homework.
Carol Taylor Fitz-Gibbon is director of the Curriculum Evaluation and Management Centre, University of Durham. Details: www.cem.dur.ac.uk