
Toward Personalized XAI: A Case Study in Intelligent Tutoring Systems

Project Overview

This page summarizes a study of an explanation feature added to the Adaptive CSP (ACSP) applet, an Intelligent Tutoring System (ITS) designed to improve students' comprehension of constraint satisfaction problems through AI-generated hints. The study evaluates how explanations for these hints influence learning outcomes, user perception, and trust in the system. The results show that providing explanations increases both students' trust in the hints and the perceived usefulness of the system. However, these effects are not uniform: they vary with individual user characteristics such as personality traits and cognitive abilities, suggesting that tailoring explanations to the individual user could optimize the educational experience and the effectiveness of AI in learning environments.

Key Applications

Adaptive CSP (ACSP) applet

Context: Intelligent Tutoring System for university students learning constraint satisfaction problems

Implementation: The ACSP applet generates AI-driven hints from students' interaction behavior and lets students request an explanation for each hint on demand (a minimal sketch of this interaction pattern appears after this list).

Outcomes: Increased trust in hints, improved perceived usefulness, and a higher intention to reuse the system. The explanations also provided insights into user behavior and learning gains.

Challenges: Variability in user engagement with explanations, complexity of the content, and different impacts based on individual user characteristics.
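
The paper describes this interaction pattern in prose rather than code. The following Python sketch is purely illustrative and is not the ACSP implementation: the names (`Hint`, `TutoringSession`, `explain`) are assumptions based on the description above, as is the logging of explanation access for later analysis.

```python
from dataclasses import dataclass

@dataclass
class Hint:
    """An AI-generated hint tied to a step in a constraint satisfaction problem."""
    hint_id: str
    text: str        # the hint shown to the student
    rationale: str   # why the tutor generated this hint
    explanation_viewed: bool = False

class TutoringSession:
    """Deliver hints and serve explanations only on request, logging access
    so engagement with explanations can be analyzed afterwards."""

    def __init__(self) -> None:
        self.delivered: list[Hint] = []
        self.access_log: list[str] = []

    def deliver_hint(self, hint: Hint) -> str:
        self.delivered.append(hint)
        return hint.text

    def explain(self, hint_id: str) -> str:
        # Explanations are opt-in: only students who ask see the rationale.
        for hint in self.delivered:
            if hint.hint_id == hint_id:
                hint.explanation_viewed = True
                self.access_log.append(hint_id)
                return hint.rationale
        raise KeyError(f"no delivered hint with id {hint_id!r}")

session = TutoringSession()
session.deliver_hint(Hint("h1",
    "Look at variable A: some of its values conflict with every value of B.",
    "Recent interaction behavior suggests repeated errors on this constraint."))
print(session.explain("h1"))  # recorded in access_log as an explanation view
```

Keeping explanations opt-in and logging each access mirrors the study design, in which only some students chose to view explanations and that choice could then be related to behavior and learning gains.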

Implementation Barriers

User Engagement

Not all users accessed the explanation functionality: some found the hints clear enough on their own, and others simply did not want explanations. This lack of engagement may stem from the low perceived value of the explanations.

Proposed Solutions: Encouraging users to view explanations by enhancing their perceived value or simplifying the content.

Complexity of Explanations

Explanations were found to be complex and verbose, which can overwhelm some users and hinder their understanding and engagement.

Proposed Solutions: Designing more lightweight explanations, incorporating visuals, and providing tiered levels of detail.
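
One way to realize the proposed tiered levels of detail is sketched below. This is a hypothetical design, not taken from the paper: the three tiers and their wording are invented for illustration, and each level simply extends the previous one so that the default stays lightweight.

```python
from enum import IntEnum

class Detail(IntEnum):
    SUMMARY = 1    # one-line takeaway (lightweight default)
    REASONING = 2  # adds the tutor's reasoning
    FULL = 3       # adds the underlying evidence from the student model

# Hypothetical tiered explanation for a single hint.
EXPLANATION_TIERS = {
    Detail.SUMMARY: "This hint targets the constraint you violated most often.",
    Detail.REASONING: "Your last few assignments broke the same constraint, "
                      "so the tutor suggests revisiting it first.",
    Detail.FULL: "The student model estimates low mastery of this constraint, "
                 "which is what triggered the hint.",
}

def render_explanation(level: Detail) -> str:
    """Concatenate every tier up to the requested level, so each level
    strictly extends the one before it."""
    return " ".join(EXPLANATION_TIERS[d] for d in Detail if d <= level)

print(render_explanation(Detail.SUMMARY))  # shown by default
print(render_explanation(Detail.FULL))     # expanded only on request
```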

Project Team

Cristina Conati

Researcher

Oswald Barral

Researcher

Vanessa Putnam

Researcher

Lea Rieger

Researcher

Contact Information

For information about the paper, please contact the authors.

Authors: Cristina Conati, Oswald Barral, Vanessa Putnam, Lea Rieger

Source Publication: View Original Paper

Project Contact: Dr. Jianhua Yang

LLM Model Version: gpt-4o-mini-2024-07-18

Analysis Provider: OpenAI
