The Volkswagen Emissions Test Defeat Device needs no introduction:
Full details of how [the defeat device] worked are sketchy, although the EPA has said that the engines had computer software that could sense test scenarios by monitoring speed, engine operation, air pressure and even the position of the steering wheel.
When the cars were operating under controlled laboratory conditions – which typically involve putting them on a stationary test rig – the device appears to have put the vehicle into a sort of safety mode in which the engine ran below normal power and performance. Once on the road, the engines switched out of this test mode.
The result? The engines emitted nitrogen oxide pollutants up to 40 times above what is allowed in the US.
— BBC News 4/11/15
This perceptive post from cavmaths shows, I think, the danger of relying on widely used educational “best practice” shortcuts. They can actually be deleterious to student understanding. In short, many of them are simply “educational defeat devices”: clever tricks designed to give a false impression of student performance under artificial test conditions, cheats that fall apart when tested in the real world.
“How the understanding is best conducted to the knowledge of science, by what steps it is to be led forwards in its pursuit, how it is to be cured of its defects, and habituated to new studies, has been the inquiry of many acute and learned men, whose observations I shall not either adopt or censure”.
–Samuel Johnson, The Rambler, April 1750
A colleague described a recent visit to a highly successful science department that has drunk mighty deep of the PIXL well. I shall summarise some of her observations and comments below. My reactions varied from intrigued to puzzled to horrified, but in keeping with the Johnson quote above, I shall endeavour to urge neither adoption nor censure — at least until I have thought about it some more.
Item the first: textbooks are forbidden. Students are taught from in-house PowerPoints and worksheets which are made available online for individual study by students. My colleague reported that she visited several classes in the same year group, and all the teachers were teaching the same topic with the same PowerPoint — and were often on exactly the same slide at the same time! Reportedly, this system was set up because science leaders were not satisfied with the quality of lessons being planned by individual teachers. For myself, I couldn’t help but be reminded of the Gaullist education minister who claimed to know which page of which textbook children throughout France would be studying on that very day . . .
Item the second: science leaders have exhaustively analysed the GCSE exam board specification to produce the materials mentioned above. Every learning point is translated into “student friendly” language and covered in detail. My information is that a typical starter activity might be for students to copy down a summary of important information from a PowerPoint, before practising application using worksheets and past paper questions. These are often peer marked. Since planning and resource making have been centralised, the workload of the classroom teacher appeared to be more manageable than in many schools.
Item the third: students are regularly tested. Test papers are gone over with a fine-tooth comb by the science team and areas of weakness identified. These are addressed in large, multi-class study skills sessions led by the head of science in the assembly hall, teaching from the front (brave woman!) using an old-fashioned OHP and transparencies! (Sigh! Now that takes me back: I can almost smell the banda machine solvent as we speak.) Students are sat at exam desks for the session, and the hall is supervised by teaching staff and SLT (including the headteacher on the day my colleague visited). This is followed by a “walk and talk” mock (i.e. the answer is modelled by the Head of Science on her trusty OHP), followed by individual exam practice under exam conditions.
And so we come to the question: shall we adopt or censure these observations?
The truth is: I am not sure.
On the one hand, I can see how this might be a rapid and effective way to improve results, especially in a school with an inexperienced science team. And the part of me that actually likes writing schemes of work and resources would relish the challenge of developing such a scheme. And I’m told that percentage science pass rates improved significantly from the low teens to the high eighties . . . over the course of a single year! And you can’t really argue with such success, can you? (Actually, yes you can: see this post on the Halo Effect.) Also, as Lt. Worf of the starship Enterprise once observed: “If winning isn’t important, why keep score?”
And yet . . .
Part of me rebels at such regimentation. Is this an example of the “McDonaldisation” of education, the continuing process of deskilling the classroom practitioner? I genuinely hate to say this, but given this model maybe Sir Ken Robinson has a point; although this particular iteration seems to owe more to Taylorism than to the nineteenth-century workhouse.
Use another teacher’s PowerPoint? Ugh! I’d rather be forced to use his toothbrush . . .
And, while I grant that many examination questions are indeed fit for purpose and thoughtfully designed to expose misunderstandings and misconceptions, I cannot help feeling that our examination system has become an overly powerful tail wagging an emaciated dog.
Is learning truly synonymous with exam success? Have we become so enamoured of the assessment of learning rather than learning itself that we, like the Scarecrow in The Wizard of Oz, would not consider ourselves truly learnèd unless we hold a diploma saying that we are?
I shall leave the final word to my friend Sam Johnson:
“The great differences that disturb the peace of mankind are not about ends, but means. We have all the same general desires, but how those desires shall be accomplished will for ever be disputed.”
— The Idler, December 1758