
Personalizing Student-Agent Interactions Using Log-Contextualized Retrieval Augmented Generation (RAG)

Project Overview

This project examines the role of generative AI (GenAI) in education through Copa, a pedagogical agent that uses log-contextualized retrieval-augmented generation (LC-RAG) to support student interactions in collaborative computational modeling environments. Rather than relying on student dialogue alone, LC-RAG combines that dialogue with environment log data to retrieve more relevant material from the knowledge base, providing personalized support for critical thinking and deeper engagement in STEM+C learning. The approach also addresses common challenges of integrating AI in education, including the risk of irrelevant retrievals and the need for a strong semantic connection between student input and the knowledge base. Overall, Copa illustrates how GenAI can improve educational experiences by offering tailored assistance and encouraging active participation in learning.
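
To make the LC-RAG pipeline concrete, the sketch below shows one way such a system could be wired together: the student's utterance is fused with recent environment log events into a contextualized query, the query is used to rank knowledge-base passages, and the top passages ground the agent's prompt. This is a minimal illustration under assumed names (`LogEvent`, `contextualized_query`, `retrieve`, `build_prompt`) and a toy bag-of-words scorer; it is not the authors' implementation.

```python
# Minimal LC-RAG sketch (illustrative only, not the authors' implementation).
# Idea: augment the retrieval query with recent environment log events so the
# retrieved passages match what the student is doing, not only what they said.
from collections import Counter
from dataclasses import dataclass
import math


@dataclass
class LogEvent:
    action: str   # e.g. "set_variable", "run_simulation"
    detail: str   # e.g. "acceleration = 9.8"


def _bow(text: str) -> Counter:
    """Very simple bag-of-words 'embedding' to keep the sketch self-contained."""
    return Counter(text.lower().split())


def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def contextualized_query(utterance: str, logs: list[LogEvent], window: int = 5) -> str:
    """Fuse the student's utterance with their most recent environment actions."""
    recent = " ".join(f"{e.action} {e.detail}" for e in logs[-window:])
    return f"{utterance} {recent}"


def retrieve(query: str, knowledge_base: list[str], k: int = 2) -> list[str]:
    """Rank knowledge-base passages against the log-contextualized query."""
    q = _bow(query)
    ranked = sorted(knowledge_base, key=lambda p: _cosine(q, _bow(p)), reverse=True)
    return ranked[:k]


def build_prompt(utterance: str, passages: list[str]) -> str:
    """Assemble the agent prompt; the retrieved passages ground the guidance."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "You are a pedagogical agent supporting a kinematics modeling task.\n"
        f"Relevant curriculum material:\n{context}\n"
        f"Student said: {utterance}\n"
        "Guide the student with questions rather than giving the answer directly."
    )


if __name__ == "__main__":
    kb = [
        "Velocity is the rate of change of position with respect to time.",
        "Acceleration is the rate of change of velocity; near Earth's surface gravity is about 9.8 m/s^2.",
        "In block-based modeling, update position each step using the current velocity.",
    ]
    logs = [LogEvent("set_variable", "acceleration = 9.8"),
            LogEvent("run_simulation", "ball falls too slowly")]
    query = contextualized_query("why isn't it speeding up?", logs)
    print(build_prompt("why isn't it speeding up?", retrieve(query, kb)))
```

In practice the bag-of-words scorer would be replaced by an embedding model and the assembled prompt sent to an LLM, but the overall shape (contextualize, retrieve, then ground the response) is what distinguishes LC-RAG from dialogue-only RAG.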

Key Applications

Log-contextualized retrieval-augmented generation (LC-RAG) with the agent Copa

Context: Collaborative computational modeling environment for high school students

Implementation: Deployed in the C2STEM learning environment, where high school students worked through a kinematics curriculum with Copa's support.

Outcomes: Improved retrieval of relevant knowledge, support for critical thinking, and student perceptions of interactions with Copa as epistemically valuable.

Challenges: Potential for irrelevant retrievals, student frustration at not receiving direct answers, and the need for better alignment between student discourse and the knowledge base.

Implementation Barriers

Technical Barrier

Retrieval depends on a semantic link between student input and the knowledge base, and this link is often weak in collaborative dialogue.

Proposed Solutions: Integrating student interactions with environment log data to enhance retrieval accuracy and contextual relevance.
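
As a toy illustration of this barrier and the proposed fix (the `jaccard` helper, sample passages, and log strings below are assumptions, not material from the paper), a vague utterance on its own shares almost no vocabulary with the relevant knowledge-base passage, while appending recent log events restores a usable semantic link:

```python
# Illustrative comparison (not the authors' code): a vague utterance alone
# gives the retriever little to work with; appending recent log events
# strengthens the semantic link to the relevant passage.
def jaccard(a: str, b: str) -> float:
    """Token-set overlap as a stand-in for a real semantic similarity model."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


knowledge_base = [
    "Acceleration due to gravity is about 9.8 m/s^2 and increases velocity each step.",
    "Check that the simulation loop updates position using the current velocity.",
]

utterance = "it still looks wrong"   # weak semantic link on its own
log_context = "set_variable acceleration 9.8 run_simulation velocity position update"

target = knowledge_base[0]  # the passage the agent ideally grounds on
print("dialogue only      :", round(jaccard(utterance, target), 2))
print("log-contextualized :", round(jaccard(f"{utterance} {log_context}", target), 2))
```

Here the similarity score for the relevant passage rises from zero to a nonzero value once the log context is appended, which is the effect LC-RAG relies on to keep retrievals relevant.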

User Experience Barrier

Students expressed frustration when the agent did not provide direct answers or made incorrect suggestions.

Proposed Solutions: Encouraging students to think critically about agent suggestions and further training the agent to improve its responsiveness and accuracy.

Project Team

Clayton Cohn, Researcher
Surya Rayala, Researcher
Caitlin Snyder, Researcher
Joyce Fonteles, Researcher
Shruti Jain, Researcher
Naveeduddin Mohammed, Researcher
Umesh Timalsina, Researcher
Sarah K. Burriss, Researcher
Ashwin T S, Researcher
Namrata Srivastava, Researcher
Menton Deweese, Researcher
Angela Eeds, Researcher
Gautam Biswas, Researcher

Contact Information

For information about the paper, please contact the authors.

Authors: Clayton Cohn, Surya Rayala, Caitlin Snyder, Joyce Fonteles, Shruti Jain, Naveeduddin Mohammed, Umesh Timalsina, Sarah K. Burriss, Ashwin T S, Namrata Srivastava, Menton Deweese, Angela Eeds, Gautam Biswas

Source Publication: View Original Paper

Project Contact: Dr. Jianhua Yang

LLM Model Version: gpt-4o-mini-2024-07-18

Analysis Provider: OpenAI
