Updated: Oct 20
For those of you who, before coming here, were also obsessed with the YouTube channel Kiki&Tom, you may recall certain harrowing episodes centred around exam revision and the exam diet itself. It was at these moments that 17-year-old Liv became humbly aware that socialising is not my only commitment at this university and that I am, in fact, going to have to sit in a scary exam hall and recall something I learnt. However, in December 2020, this was not the reality with which I was met. Instead, for the past three years, heavily weighted coursework and online, typed exams have replaced those hallowed halls. Opinions on these vary massively, but regardless of where you stand, schools may be taking a step back towards in-person handwritten exams. This comes as a result of the growth of generative AI services, such as ChatGPT, which potentially threaten academic integrity.
The theory is that students, despite the threat of academic misconduct, can ask something like ChatGPT an essay question during their online exam and simply paste the answer into their exam paper. This has raised many questions in the minds of university authorities across the world, with responses spanning a massive spectrum: from New York City schools banning it from all devices and networks, to Dr Thomas Lancaster of Imperial College promoting oral assessments, and Michael Drapper of Swansea University encouraging students to practise with it since they’ll use it in the workplace. The latter responses were perhaps influenced by Sam Altman, OpenAI CEO, who offered that “we invented calculators and changed what we tested for in math class”: a seemingly “it’s your problem” response.
An alternative to finding new modes of assessment is simply to return to the old ones. Cue the traditional, loved-or-loathed, sit-down invigilated exam. Obviously, this is problematic for a generation of students, many of whom have not sat such exams in four years. Additionally, AI could still be used in essay-based coursework, so what’s the point? Unless, that is, we are suggesting a catapult back to 100% weighted exams. That would be quite a controversial choice, but it is exactly the move many Australian universities are making.
I chatted with some professors in the School of History to hear their opinions on the presence of ChatGPT within their school. Alison Beach, Professor of Medieval History, voiced her concern about the use of ChatGPT in online exams, emphasising the need for clarification on exactly what counts as academic misconduct. She does not think these AI services pose a threat to essay-based coursework, as the technology cannot produce a decent academic essay… yet. With regards to returning to in-person, written exams, Professor Beach believes her school would do better to consider what exactly they are testing students on. Recalling facts? Then those kinds of exams are great. However, if exams test critical thinking and other higher-order academic skills, then the fear of a computer emulating a student’s response falls drastically.
John Hudson, Professor of Medieval History, expressed that he is not terribly bothered from the point of view of undetectable student misconduct. If students utilise these generative AI services at present, he equates it to a Wikipedia search in the early days of Wikipedia: when a student has directly produced material in this way, use is strikingly obvious from referencing issues and the overall quality of the work, particularly at honours level, where modules become increasingly specific. The problems lie at sub-honours, where there is a danger that students may scrape a pass where they otherwise wouldn’t. However, he emphasises that constant technological development means the services will improve and could then pose a real threat. Looking forward, he believes that the University ought to reintroduce sit-down, invigilated exams in some form for sub-honours students, though this poses its own set of issues, from handwriting to stress on students.
So what are the current plans? The School of Philosophy commented that they are “beginning to work with students to develop new assessments that are resistant to cheating using ChatGPT and similar tools.” They are looking into a variety of options, saying, “We always try to design assessments that enhance learning. Exams might be appropriate in some cases, but I don’t anticipate returning to exams across the board: many of us doubt that they are the best way to promote the kind of deep, critical, creative thinking that we aim to teach.” Derek Patrick, Academic Support Officer for the School of History, commented that the school are very aware of these types of AI and that they have been the subject of debate at various committee meetings over the last six to twelve months. The school are encouraging different types of assessment but may consider a trial for small sub-honours classes. He concluded by noting the impact that any changes would have on students: “even minor alterations will be discussed at the staff-student council so the student representatives get the chance to comment on any potential knock-on effect.”
I then spoke to Gerald Prescott, Associate Dean of Education (Science), and Shiona Chillas, Associate Dean of Students (Arts and Divinity), to see where University policy is headed on the overall use of AI. They informed me that the Good Academic Practice (GAP) policy has been updated to define unauthorised use of AI as a form of misconduct. The policy reads: “Unauthorised use of Artificial Intelligence (AI) is when a student presents the output of an artificial intelligence technology, such as a large language model (LLM) or paraphrasing application, as their own work. This does not apply to assessments which specifically permit or encourage the use of such tools.” They are currently conducting additional work to further clarify how generative AI may or may not be used by students and staff.
I asked what the process is behind passing the GAP policy and was assured that it involves a wide consultation among students and staff, including school presidents, Directors of Teaching, the Learning and Teaching Committee, and the Undergraduate Academic Forum.
Interestingly, the University’s current policy is not to utilise Turnitin’s chatbot detector. The Associate Deans told me this is largely because they are not convinced of its detection abilities; besides, why start an “arms race” when the University is full of expert staff who can safely be trusted to know their subject and its sources?
With regard to the exam situation, some schools have been permitted to return to in-person exams this academic year. However, it is treated as a temporary response while the University takes more time to reflect on the impacts of AI and decide on a long-term plan.
Most schools are looking towards other changes to assessment. Potential ideas include changing the writing context of an essay question, more emphasis on reflection and critical thinking, a requirement to submit an annotated source as part of a source analysis, and many more. But couldn’t a chatbot answer these questions? Well, not yet. Chatbots answer a question by generating the most statistically plausible response based on patterns in the massive body of text they were trained on; they do not verify facts, so their answers, and any sources they cite, may be inaccurate. So it seems the academic essay will live another day!
My interview with the Associate Deans of Education and Students concluded with a very true and, if I may say, encouraging point: it is unwise to think of this technology in “us vs them” terms. It will have many benefits, but of course limitations too, and we must learn to use it responsibly and ethically. After all, the University of St Andrews’ education has evolved over 600 years, responding to countless disruptions; what’s one more?
From my time exploring this topic, I can safely assure you of two points. Firstly, writing for Science & Technology as a medieval history student proved a tad more testing than I originally anticipated. Secondly, and most importantly, you will not suddenly have the weight of all your modules depending on how fast you can write. Instead, we should remain positive that the University has faith in the ethics of its students and get excited about the benefits that developments in technology can bring. Even if that means more class presentations.
Illustration by Olivia Jones