Three rules for AI and University

This morning, I found myself chatting about AI with Bridget and Francesca, two postgraduate students, at my local coffee shop. One told me that half her teachers tell her to use it and the other half prohibit it. AI is not going to disappear any more than internet searching has. In fact, the two are now intimately connected, with Google building AI into its search results. What should the rules be?

Let us begin with the easy stuff. To ban a student from using AI to gain a grasp of a subject to which access is otherwise blocked is silly. AI is a resource like a textbook. Both contain elementary mistakes; each can help deliver a baseline understanding.

  1. Where, in a written assignment, be it a class essay or a doctoral thesis, students rely on an AI finding, they should footnote the source; with AI, that means citing the question posed to the programme and the software used.

Students should always be encouraged, or required, to take any references found via AI, check them and refer to the actual sources. A citation of something else cannot rest on ChatGPT or a Google search; the author has to have read the original. If not, the correct citation would be “X as discussed or mentioned by ChatGPT when asked… ”.

Mr Justice Fordham was right to condemn the use of fake case citations in Bandla v Solicitors Regulation Authority [2025] EWHC 1167 (Admin) https://www.bailii.org/cgi-bin/format.cgi?doc=/ew/cases/EWHC/Admin/2025/1167.html, regardless of where the author had originally found them. Mr Justice Ritchie did something similar in the recent Ayinde case: https://www.bailii.org/cgi-bin/format.cgi?doc=/ew/cases/EWHC/Admin/2025/1040.html.

One can easily blame AI. That is the wrong answer.

  2. The responsibility for a citation rests with the person doing the citing, who must check the original. Misleading a court, a public authority, a research community or an examiner in this way lands at the feet of the person doing it.

Stories abound about fake materials generated by AI. The same, though, can be true of reputable practitioner textbooks. A story did the rounds in the 1980s about a new edition of a practitioner book produced by young lawyers, nominally under an editor who never read the manuscript. The authors invented some old cases for the footnotes to expose this!

One of the joys of AI at present is that its errors expose its use. So, when universities prohibit its use, they need to set assignments that expose this problem. A problem-based assignment against the clock under exam conditions, even one allowing reference to the internet, should reduce the issue. In fact, if AI produces the right answer and the candidate gives it, that does relatively little harm.

Setting a time-unlimited assignment, particularly of the “reflect on this module” type, invites AI use, which is one of several reasons not to require this kind of work (another is that the results will often be uninteresting and good students will struggle to escape mediocrity).

  3. Teachers have a responsibility to show students how best to learn. That might involve using AI to break down blockages. At present, however, AI does not represent the best way to develop profound knowledge or to help people learn more.

That may change or may already have changed. Teaching students how to learn has never been easy. I routinely tell comparative international arbitration students not to use general textbooks because the subject is much too big to write a general book properly. Students need to follow advice when it comes from a reputable source, or at least test it. In fact, testing and exposing bad sources of all types is part of the postgraduate study process. Postgraduate teachers have to show their students how to learn: by doing laboratory or empirical research (while understanding its risks and limitations), studying other papers and source material, and the like.

Get in touch

Contact Adam by e-mail, phone or post at:

E-Mail: adamsamuel@aol.com or adamsamueltc@yahoo.com

Mobile: 07900 248150

The Attic, 117 Priory Road, London NW6 3NN.