AI in education isn’t a crisis, it’s an indictment of the whole thing as a means to an end

Image composite. Big Ben collapses while an Android is laid on its spire.
(Image credit: Luke Hughes)

Over 1,000 UK undergraduates were surveyed by the Higher Education Policy Institute (HEPI), and 53% admitted to using popular AI tools like ChatGPT or its innumerable imitators to create content, generate ideas, or both.

The Guardian phrases the next bit perfectly, so I’m going to quote them on it: “Just 5% admitted copying and pasting unedited AI-generated text into their assessments.” 

Right, so maths isn’t the strongest point of anyone jumping to a froth-at-the-mouth conclusion, but that’s at minimum 50 students, and almost certainly fewer than 100. True, this is one study’s sample, but it’s also a big one.

This also isn’t the first time studies like this have been done and prompted rethinks in how to secure academic integrity in the age of AI. But if AI can trample all over university degrees and courses, doesn’t that mean that they’re no longer fit for purpose? Shouldn’t educators adapt?

Adapting to AI in education

Well, they might be doing so. I’m reading a Wired article (paywall) from just over a year ago at the time of writing, and its understanding of AI’s role in plagiarism is slightly depressing: a lot of equivocation over ‘hmm, if a computer generated the content, is it really plagiarism?’, and not much recognition that ‘AI’, as we know it in this context, is just a computer that’s been force-fed a human-produced (and often itself copyright-infringing) corpus, not a literal sentient being.

But the educators quoted in the Guardian article seem pretty switched on. Take the following:

“I’ve implemented a policy of having mature conversations with students about generative AI. They share with me how they utilise it,” [Dr Andres Guadamuz, Intellectual Property Law Reader, University of Sussex] said.

UK educators are also benefiting from the existence of AI. The Guardian writes that 58 secondary schools have been enrolled in a research project by the Education Endowment Foundation (EEF), in which teachers will use AI to generate lesson plans.

The report says nothing of how lecturers are taking to it, but I think it’s likely that they are. Members and representatives of the two main higher education unions in the UK, the University and College Union (UCU) and UNITE, have been locked in battle with universities over pay and working conditions since I was a student, and it looks like it’s about to kick off again. Anything to lighten the load.

All of this sounds a hell of a lot more compassionate than hyping AI up to be the Antichrist, and threatening students with a stain on their academic record without any attempt to, er, educate students about what AI is or does.

That, at least, seems to be the overarching tone of that old Wired piece, despite an anecdote from a real-life, breathing student about how poor ChatGPT is at producing engaging, let alone informed, academic material, and how they wouldn’t use it anyway.

Personal anecdote break

I could get drummed out of the magic circle here, but officially, at Future PLC, TechRadar Pro’s parent company, I’m a Graduate Junior Writer. My having gone to university, in a time before artificial intelligence, is basically the reason I get to register industrial-strength opinions that make no discernible difference to the way things are.

I’m also a pretty solid opponent of generative ‘artificial intelligence’. By and large, it’s a way of laundering copyright infringement, diluting the work of individuals, and making things up as it goes along to produce a kind of tasty Swiss cheese of prose. Bad actors (including, er, the HEPI study) call this last one ‘hallucination’, but I think I’m going to call it ‘lying’.

Where written content generation is concerned, Future PLC investigates AI use and disciplines where plagiarism is uncovered. Yet now I find myself in the strange predicament of… not caring about AI use? At least in the realm of education.

Representation of AI

(Image credit: Shutterstock)

I don’t care if students use AI to get a degree

Enticing heading, but it’s not because I’ve received a dark money payment in the last thirty seconds to make me bang on about how AI is the future, or whatever. It’s because AI’s net good has been proving that the education system, and the way it’s perceived by the working world, is broken.

We ran a story this week about how a majority of young people are struggling for job experience. I’ve personally faced this. Even getting this ‘graduate’ role was, I believe, more down to my relevant job experience, which I absolutely debased myself to get, than the actual piece of paper I got from my university for my tens of thousands of pounds and unceasing toil.

Reading it incensed me.

All of this is to say: the university degree has become so worthless, yet such a prerequisite of modern working life, that not only do I not care about the most egregious uses of AI in higher education, I’m actually somewhat saddened that the number of students engaging in that kind of use isn’t higher.

AI use by students in assessments indicts university courses as being dull as dishwater, and too expensive for what they are, more than it does students for being hardened academic criminals.

Some students don’t test well, or learn differently, or are just here because, of course, you need a degree to get a job. That was a ‘round peg in a square hole’ scenario even when higher education was more accessible, but now institutions are putting students in the same situation while also placing more financial constraints on them.

Given this, I would suggest one of the following:

a) Just giving the student the piece of paper, for God’s sake, so they can get on with their life.

b) Starting to phase out ‘you need a degree to work’ as a culturally embedded principle, if you want people in work regardless (which you do).

c) Overhauling the assessment process so that it caters to multiple learning styles and dares to actually be interesting, which would also thwart ‘the rise of artificial intelligence’, or whatever.

My experience of how distinctly unbothered employers and educators alike are by degree content and structure leads me to believe that, had I been able to use AI at university, my life would not have been changed in any meaningful way, other than by vastly decreasing the sheer amount of spinal fluid wrung from me to get here.

AI, like everything that’s made it into the zeitgeist at the behest of a nebulous, financially-motivated actor, is nightmarish and a cesspit. However, the education system is also a nightmarish cesspit, and AI has helped reveal that. 

In this one particular scenario, AI in education doesn’t need regulation; it’s just doing what it’s supposed to: regurgitating and bluffing back at you. If that’s enough to pass for what undergraduates do anyway (I’ve been there; it was, and it is), and thus short-circuit higher education as we know it, then AI, for once, is not the problem, and the kids might actually be alright.

Workable solutions do exist

To be constructive in offering solutions more realistic than ‘reverse decades of the commercialisation of higher education via legislation with more legislation’, I do have some ideas.

Start by taking the rot(e) out of how assessments are delivered, in favour of a wider variety of projects, and focus on course content and delivery methods so that students actually want to engage with the assessment material. I concede, however, that this would still require ministers, secretaries, and university staff alike, dutifully insistent as they are on shooting themselves in the foot, to admit that they are wrong.

This sounds combative, but I should be fair. One senior figure in higher education who makes a solid argument along these lines is Professor Dilshad Sheikh, Deputy Pro-Vice Chancellor and Dean of the Faculty of Business at Arden University.

She says that Arden, a blended and online higher education institution, is taking steps away from punishment and towards education when it comes to AI use.

“Arden University argues that instead of punishing students for using such technology in all circumstances or trying to train lecturers to notice the signs of AI-generated content, they should be teaching students how to use it to help enhance their work and processes. The university is, therefore, exploring how best to integrate AI into learning, teaching and assessment strategies, recognising that a positive pioneering approach to AI is more beneficial to students.”

“Many other universities are focusing on plagiarism and how AI chatbots give students the opportunity to cheat on assignments. However, the reality is that the technology cannot replicate understanding and application of knowledge in authentic assessments, which is how we design our courses. The truth of the matter is that times are changing, so how and what we teach should change too.”

“AI will continue to get smarter to make our lives easier. We are seeing more and more businesses embracing such technology for the betterment of their growth, so why should we punish our students for using the same software being used in the real world?”

AI and the real world

This last point is pretty interesting, and one that I hadn’t really considered until now. AI is being laundered into workplaces as a productivity tool, but its pitfalls are surely the same as in education, as Future PLC has seen.

True, I’ve made no secret that I don’t use AI and take a pretty dim view of the whole thing. But using AI responsibly - for prompts, for ideas, rather than for content - and evangelising that kind of use in a learning environment, is perhaps making the best of a bad situation. 

And, evidently, small but vitally important moves are being made from all sides in the UK’s higher education system: to educate students, to engage critically with AI’s unsuitability for producing excellent, insightful academic work, and to push for change in how degrees are taught and thus re-engage students.

It’s a good sign that the student-university transaction, though still a transaction, and one mandated by many workplaces in this country at this time, could be about to become more valuable to students, the people who benefit the most from it.

And then - who knows? We might just stop having to read about lecturers getting mad in national newspapers that their assessments can not only be passed by a computer literally making it up as it goes along, but that students are disengaged enough to prefer all of that to applying themselves. With higher education in the state it’s in, I still don’t blame them.

Luke Hughes
Staff Writer

Luke Hughes holds the role of Staff Writer at TechRadar Pro, producing news, features and deals content across topics ranging from computing to cloud services, cybersecurity, data privacy and business software.