Four Reasons Not to Use ChatGPT in University Teaching


These are some reflections stimulated by a course I recently attended on AI in university teaching, where many of my colleagues spoke about getting students to work with the outputs of ChatGPT. There is an opportunity cost with these activities – time spent working with ChatGPT output is time that could have been spent analysing academic sources – so we need to think about whether it is a valuable use of teaching time. I am initially sceptical. These are my reasons.

Reason 1: ChatGPT will not solve the plagiarism problem created by ChatGPT

The most frequent reason I hear for introducing ChatGPT in the classroom is that students will be using it to plagiarise anyway. Getting them to analyse ChatGPT output is seen as a way to stop them from cheating. But there is no reason a student cannot use ChatGPT to analyse the output of ChatGPT, just as they can apply it to any other text, so it is not clear how this is supposed to prevent cheating. Training students to use ChatGPT more effectively may increase cheating rather than reduce it.

Reason 2: ChatGPT produces crappy output

Currently ChatGPT produces passable but slightly crappy answers to scientific questions. I have tested it sparingly (see Reason 4) on a couple of questions, and its answers are usually at the level of a decent but not excellent undergraduate student – on descriptive questions it gets most things right but often leaves out key information, and on evaluative questions it performs worse still. It is prone to “hallucinations” and has a well-documented tendency to invent fake references. It does not approach the standard of a good academic article, so it is questionable whether analysing this output is a more worthwhile use of students’ time than analysing an academic source. I would not usually teach my course by making students read the essays of middling students (unless as a form of peer feedback to help the writer), so why would I make them work with output of similar quality just because it is produced by AI?

Reason 3: ChatGPT has no respect for the norms of academic practice

A core value of academic work is that we treat our sources carefully: we analyse them for their credibility, and we attribute credit for prior work in precise ways. ChatGPT is fundamentally at odds with these values. It produces generalised output that is parasitic on the prior work of human authors but aggregates that work in a way that makes it impossible to know which sources it is drawing on and how. The output itself also lacks credibility (e.g. the aforementioned hallucinations and fake references). Teaching students to use ChatGPT as an accompaniment to their studies is therefore a step away from the norms of standard academic practice.

Reason 4: Using ChatGPT is a waste of material resources

The technology industry has done a fantastic job of convincing us that chatbots and voice assistants inhabit a magical, immaterial realm, and that there are no material resource implications to asking Alexa for the time or asking ChatGPT to write an essay. But this ethereal presence hides the complex chain of material resources implicated in these transactions, concealing extractive mining and questionable labour practices. Large language models like ChatGPT are also thirsty beasts, and there are fears that their growing use will strain water resources in an increasingly drought-prone world. Encouraging large numbers of queries to AI chatbots when there is no good case for doing so is an unjustifiable waste of material resources.

So, all in all, I currently have no plans to incorporate ChatGPT into my teaching, at least when it comes to teaching substantive content, which in my case means subjects like theories of democracy. The caveat, of course, is that if I were teaching a course where LLMs are a relevant concern, then I would teach about them (but not through them). And a stronger case could be made for using them more actively if the aim is to teach students about potential digital tools for studying. However, given Reasons 2, 3 and 4, the appropriate lesson here may simply be to demonstrate that they should not be used for academic writing. These are just some first thoughts. I am interested to hear others’ opinions and open to being convinced that I am wrong.

1 May 2023