Dirty AI truths Part 1
Most people in higher education are missing the point or in denial
James
8/20/2025 · 2 min read


The most quaint and touching thing I've done this year is completing online declarations before taking university quizzes, certifying that "I am completing the quiz under exam conditions... and not using any materials to assist with answering the questions". As if this would prevent a student from doing exactly that, or somehow absolve the assessment creators of blame or accountability, like a Catholic priest giving absolution.
Let me tell you how absolution feels. It feels like the warm embrace of ChatGPT.
We should talk about the very obvious elephant in the room. This isn't about open versus closed-book exams; it's about using AI. If you're reading this in the year 2035 (or, come to think of it, right now), you'll immediately recognise how naive, bordering on risible, this 'honour system' approach is. King Canute is paddling for grim death against the tide.
The clever tertiary educators out there have already figured out a few things:
There's nothing inherently wrong with using AI. It's not cheating, any more than a woman with a calculator is 'cheating' on a maths exam at university. This needs to be reframed. Expecting the 'honour code' to prevent its use is asinine.
Good assessments must test knowledge application, transfer, and critical thinking / clinical judgement, rather than simple recall of 'facts'*. Actually, good assessments have always done this and poor assessments rarely do. So nothing new here.
If you truly need to assess whether a student has memorised and can retrieve specific information on demand, you'll have to do so in an exam room with invigilators.
The way in which we use AI is what needs (ethical) focus and attention. Throw open the doors because, dear reader (I don't know if you're aware of this), the horse is long gone and the stable doors are swinging in the breeze. Do you think there are any kids out there who aren't using AI with their schoolwork? Encourage its use because it's already being used, no matter what you tell yourself in the darkness. Demand better AI engagement and application. As usual, universities and legislators are behind the curve and need to be more agile in their responses.
There is an implicitly 'Spartan' approach to all of this. The 'hidden curriculum' at play is: don't get caught using AI.
I dedicate myself to the road anew. Death rides beside me.
*There is a level of nuance required here: Daniel Willingham's excellent book 'Why Don't Students Like School?' illustrates how deep comprehension and critical thinking are built on a bedrock of facts.