- By Hazel Shearing and Shiona McCallum
- Education correspondent and technology reporter
With exam season approaching, students may be tempted to turn to new Artificial Intelligence (AI) tools to give them an edge on assessments.
Universities have struggled to understand what AI applications like ChatGPT are capable of and to introduce guidance on how they can be used – and now they are being urged to teach students how to use them.
University of Bath academics considered the challenges and opportunities.
“Our first question was, ‘Could this be used by students to answer our assessment questions?’” says James Fern of ChatGPT, an online tool that can answer questions, including producing essays and emails, in human-like language.
James tested how well his department’s assessments stood up
“Multiple-choice questions, for example, it will handle these very well.
“We definitely didn’t expect it to go so well…it was getting close to 100% correct.”
But with more complex questions, which require students to think critically and which he says make up the bulk of assessments, it struggles.
One example, taken from a final-year assessment, reads: “Why is it important to understand the timing of exercise in relation to nutritional status in people with overweight [a technical term, according to James]?”
And there are telltale signs that the answer given by ChatGPT was not written by a student.
“At first glance, it looks very nice – it looks very clearly written, it looks quite professional in its own language,” James says.
But some statements are more like those of a GCSE student than those of an undergraduate.
It has a habit of repeating the exact wording of the question in its introductions and conclusions, “just written a little differently.”
And when sources of information are cited, as is the norm in academic work, they are simply made up.
“They look perfect — they have the right author names, they have the right journal names, the titles all make great sense — they just don’t exist,” says James.
“If you’re unaware of how large language models work, you’ll easily be fooled into thinking that these are authentic references.”
Since ChatGPT was released to the public about six months ago, many students have not been sure when they can and cannot use it.
“I might be tempted to use ChatGPT…but right now I’m too scared of getting caught,” says a student walking between classes on campus.
“It’s still unclear what counts as cheating with ChatGPT,” says another. “If you copied the entire task from ChatGPT it would be cheating, but it can be really helpful to guide you.”
The university is encouraging staff to explain to new and returning students in September how and when to use AI, and to adapt coursework where appropriate.
Marketing lecturer Kim Watts calls it “another tool in the toolbox,” and some students in her department have already started using ChatGPT this term, in courses that require them to produce a marketing plan.
Kim says ChatGPT will help get students ‘started’
“I suggest students who maybe don’t know where to start go to ChatGPT… and start playing with prompts,” she says.
“It won’t give them answers, but it can give them ideas.”
‘Critical thinking’
Kim demonstrates by asking ChatGPT to produce a marketing plan.
It responds with a series of numbered bullet points—everything from building a brand identity to using social media.
But Kim, looking up from her screen, says: “This wouldn’t pass.
“Submitting something like that isn’t detailed enough. It doesn’t show us any learning, it doesn’t show any critical thinking.”
Neurodivergent students and those for whom English is not their first language will benefit the most from ChatGPT, Kim says.
But any student who chooses to use it is asked to submit their ChatGPT prompts and responses as an appendix, to make “really clear how far they’ve come” from the chatbot’s output.
Summer exams
As with most universities, Bath’s policy on ChatGPT and other AI tools is still a work in progress; it is due to take effect in September.
Thereafter, a team will meet throughout the year to make sure they stay ahead of the rapidly changing technology.
Meanwhile, many of the staff are once again setting up in-person, supervised summer exams.
Dr Chris Bonfield, who leads a team that helps design assessments, says the “default assumption” is that students shouldn’t be using ChatGPT this year. And, if staff decide to allow it, they should set their expectations clearly.
The pace at which the technology is evolving presents a challenge for universities, but Bath has quickly moved away from conversations about banning it.
Chris says the pace at which technology changes presents a challenge for universities
“This tool is not going away,” says Chris.
“We will have to work hard to ensure that our students are equipped with the skills they need for their future workplace, but also that our degrees stay up to date.”
‘Too dangerous’
Professor Verena Rieser, a computer scientist at Heriot-Watt University who has worked in artificial intelligence for two decades, says her own students are “using it in very creative ways.” But chatbots are still in the early stages of development, she says, and “can be used to generate disinformation at a scale which is obviously very worrying” when it comes to education.
Earlier models of ChatGPT were not released because they were deemed “too dangerous,” she says.
Its developer, OpenAI, says that “like any technology, these tools carry real risks,” and that it works “to ensure that security is built into our system at every level.”
“I expect that soon we will see different types of ChatGPT from different companies out there and hopefully also more secure models that actually mitigate the possible dangers,” Verena says.
“Right now we really don’t know how to stop models from giving wrong, toxic or hateful information, and that’s a big deal.”