O for a draught of vintage! that hath been
Cool’d a long age in the deep-delvèd earth
— John Keats, Ode to a Nightingale
The Northfarthing barley was so fine that the beer of 1420 was long remembered and became a byword. Indeed a generation later one might hear an old gaffer in an inn, after a good pint of well-earned ale, put down his mug with a sigh: “Ah! that was proper fourteen-twenty, that was!”
— J. R. R. Tolkien, The Grey Havens, from The Lord of the Rings
I don’t know about anybody else, but I could do with a draught of the vintage good stuff right about now. I am that old gaffer in the pub muttering: “They should bring back POAE, they really should.”
In all probability, only Science teachers of a certain generation (translation: old farts like me) will recognise the acronym P.O.A.E.
For the youthful pups who now seem to comprise the majority of the UK’s teaching workforce, it stands for “Planning, Obtaining evidence, Analysing and Evaluating”, the “strands” (dread word!) by which we used to mark practical skills in the good old days of yore, when the world was yet young.
And truth be told, they weren’t all that good. It is only in comparison with more modern iterations that they achieve their near-mythic ‘fourteen-twenty’ status.
One of the jobs I have been studiously avoiding over the summer holidays is marking a portfolio of Y10 students’ controlled assessment practical work. I am dreading it. The reason is, I have to use the worst mark scheme ever developed in the entire history of humankind. Or before. Or, applying a rigorous Bayesian statistical analysis of relevant probabilities, since.
Accuse me of hysterical hyperbole if you will, but take my word for it: this mark scheme is a turkey that out-turkeys all the Christmas lunches served over the past two millennia.
Let me explain. What is the purpose of marking students’ coursework or controlled assessment? Wearing our summative, assessment-of-learning hats for a moment, the essence of marking in this context is to generate a number that indicates a student’s relative performance. Ideally, another professional marking the same student’s work would generate a similar number.
Using the old-style POAE scheme, I would have to assess a student’s work against 25 hierarchical criteria which would give a “best fit” number out of a maximum of 30 marks. (Boy, this sure is a fun post, isn’t it?) From memory, moderators would tolerate a disagreement of plus or minus 3 marks before adjustment.
Using the modern, rubbish mark scheme, I have to assess a student’s work against, by my count, 67 hierarchical criteria which give a “best fit” number out of a maximum of 64 marks. This takes a while, as I challenge anyone to memorise or internalise the mark scheme.
And the end result: is a mark out of 64 ‘better’ than a mark out of 30? Does it allow a finer discrimination between the performance of students?
In theory: perhaps. In practice: no. It is just another example of assessment-itis: “-itis” being the most appropriate suffix in this case as the entire system of assessment is, indeed, inflamed. More is, in fact, less.
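For what it’s worth, the back-of-envelope arithmetic behind that “more is less” claim can be jotted down in a few lines of Python. The figures are the ones quoted above (25 criteria for 30 marks under POAE, 67 criteria for 64 marks now); the ±3 moderation tolerance is, as I said, from memory:

```python
# Rough comparison of the two mark schemes, using the figures quoted above.
schemes = {
    "POAE (old)":    {"criteria": 25, "max_marks": 30},
    "current (new)": {"criteria": 67, "max_marks": 64},
}

for name, s in schemes.items():
    marks_per_criterion = s["max_marks"] / s["criteria"]
    print(f"{name}: {s['criteria']} criteria -> {s['max_marks']} marks "
          f"({marks_per_criterion:.2f} marks per criterion)")

# The old +/-3 tolerance was 10% of a 30-mark total; the equivalent
# leeway on a 64-mark scale would be about +/-6.4 marks.
old_tolerance_fraction = 3 / 30
print(f"Old moderation tolerance: {old_tolerance_fraction:.0%} of total")
```

In other words, the new scheme asks you to apply nearly three times as many criteria to award slightly fewer marks per criterion: more boxes to tick, no finer a measurement.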
As an example, under the old POAE-scheme, the P for Planning strand (dread word!) had 7 criteria and a maximum 8 marks. Using the new mark scheme, I mark the same set of skills which are now labelled as S for Strategy (“Mategy, Categy, Sategy”) and include two individual sub-strands (even more dread words!) with a total of 21 marking criteria and a maximum 16 marks. And . . . it doesn’t tell the student or the teacher anything that the older scheme did not.
It is, in my opinion, a badly-designed exercise in futility which provides no useful guidance or feedback for either student or teacher. Let it be sent forthwith to whatever corner of limbo that clapped-out assessment formats go to die. A curse upon it, and . . .
Sings to the tune of “My Bonnie Lies Over the Ocean”:
Bring back, bring back, O bring back my P-O-A-E, A-E!
Bring back, O bring back my P-O-A-E to me!