Conversational AI: A Case Study on AI-Assisted Learning in a Criminal Law Subject

Dr Armin Alimardani, University of Wollongong, A/Prof Emma Jane, University of NSW

Abstract

The advent of Generative Artificial Intelligence (GenAI) has sparked considerable interest in its potential to augment educational experiences. Despite the enthusiasm, the development of this transformative technology is dominated by STEM disciplines, which means many academics in HASS-related disciplines can only consider the promises and perils of AI in highly abstract terms. This gap is exacerbated by the proprietary nature of corporate-developed chatbots, which often lack transparency and raise concerns about data privacy and alignment with pedagogical values. This study addresses these concerns by presenting 'SmartTest', an AI chatbot designed by HASS academics. SmartTest prompts learners with open-ended questions, offers immediate feedback, and employs conversational nudges to stimulate the re-evaluation of incorrect answers. In Spring 2023, SmartTest was piloted in Criminal Law and Procedure B at the School of Law, University of Wollongong, over a five-week period. A subsequent survey assessed student experiences with the chatbot.

Our findings indicate that GenAI's current capabilities are limited when addressing complex, structured problem questions, such as those requiring the ILAC framework, due to difficulties in following multi-step instructions. Conversely, SmartTest demonstrates promise in facilitating learning through multiple short-answer questions, providing helpful hints when students answer incorrectly. Contrary to expectations, 'hallucination' (the generation of irrelevant or incorrect facts) was not problematic for SmartTest; rather, 'alignment' (the ability of GenAI to follow a prescribed instructional sequence) emerged as a significant challenge. The study also identifies secondary concerns, including the potential for students to exploit the chatbot for unintended purposes, such as prompting it to answer other class activity questions. The survey results revealed a general appreciation for SmartTest's role in guiding students toward correct answers, as well as lower anxiety levels compared with interactions with human instructors. Nonetheless, some students indicated a preference for receiving feedback from a human instructor, despite a potential delay of up to several days.
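By way of illustration only, the kind of prescribed instructional sequence at issue can be expressed as a short scripted loop around a chat-completion API. The sketch below is not SmartTest's implementation; the model name, prompt wording, and use of the openai Python client are assumptions introduced purely for demonstration.

```python
# Minimal sketch (not SmartTest itself): a short-answer tutoring turn in which the
# system prompt prescribes the instructional sequence the chatbot should follow.
# Assumptions: the `openai` Python client is installed, OPENAI_API_KEY is set, and
# the model name and prompt wording are illustrative only.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a tutor for a criminal law subject. Follow this sequence strictly: "
    "1) Pose the short-answer question you are given. "
    "2) If the student's answer is correct, confirm briefly and stop. "
    "3) If the answer is incorrect, do NOT reveal the answer; give one short hint "
    "and invite the student to try again. "
    "4) Only reveal the model answer after two incorrect attempts. "
    "Decline requests unrelated to the current question."
)

def tutor_turn(history: list[dict], student_message: str) -> str:
    """Send the running conversation plus the latest student message to the model."""
    history.append({"role": "user", "content": student_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
        temperature=0.2,  # lower temperature to encourage adherence to the sequence
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply
```

Whether the model reliably adheres to steps 2 to 4 across a full exchange, rather than revealing answers early or responding to unrelated prompts, is precisely the 'alignment' challenge reported above.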

To promote the democratisation of emerging technologies across all academic disciplines, SmartTest has been made publicly available for educators to explore and adopt.*

* Further insights into SmartTest's no-code, user-friendly functionality will be discussed during the conference presentation.

Note: This study has been approved by the Social Sciences Human Research Ethics Committee of the University of Wollongong, Reference 2023/193.