
LLM-based Cognitive Models of Students with Misconceptions

Project Overview

This project explores the integration of generative AI in education through Cognitive Student Models (CSMs): Large Language Models (LLMs) tuned to reproduce student cognition and misconceptions in algebra. A central contribution is MalAlgoPy, a Python library for generating datasets that reflect common student errors in algebraic problem solving. The research highlights the difficulty of modeling misconceptions without eroding correct problem-solving skill, and advocates careful calibration of training data for educational AI applications. By addressing these challenges, the study aims to enable more effective personalized learning tools that adapt to individual student needs and foster better understanding in mathematics.

Key Applications

MalAlgoPy - Python library for generating datasets reflecting student solution patterns

Context: Educational technology for secondary school mathematics, specifically algebra

Implementation: LLMs are instruction-tuned using datasets of misconceptions and correct solutions; MalAlgoPy structures algebraic equations in a directed acyclic graph to simulate student problem-solving paths.
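The graph-based representation can be illustrated with a minimal sketch. MalAlgoPy's actual API is not reproduced here; the equation states, rule names, and helper function below are hypothetical, showing only the idea of a directed acyclic graph whose nodes are equation states and whose edges are correct or misconception transformations.

```python
# Hypothetical sketch in the spirit of MalAlgoPy (not the library's real API).
# Nodes are equation states; edges apply either a correct algebraic rule or
# a misconception rule, yielding distinct student solution paths.
from collections import deque

# DAG as an adjacency map: state -> list of (next_state, rule, is_correct).
GRAPH = {
    "2x + 3 = 7": [
        ("2x = 4", "subtract 3 from both sides", True),
        ("2x = 10", "add 3 to both sides (misconception)", False),
    ],
    "2x = 4": [("x = 2", "divide both sides by 2", True)],
    "2x = 10": [("x = 5", "divide both sides by 2", True)],
}

def solution_paths(start):
    """Enumerate every path from the start equation to a terminal state,
    flagging whether all steps on the path used correct rules."""
    paths = []
    queue = deque([(start, [start], True)])
    while queue:
        state, path, all_correct = queue.popleft()
        edges = GRAPH.get(state, [])
        if not edges:  # terminal state: record the finished path
            paths.append((path, all_correct))
            continue
        for nxt, rule, correct in edges:
            queue.append((nxt, path + [nxt], all_correct and correct))
    return paths

for path, correct in solution_paths("2x + 3 = 7"):
    label = "correct" if correct else "misconception"
    print(f"{label}: " + " -> ".join(path))
```

Enumerating paths this way yields both the valid solution and the erroneous one a student with the misconception would produce, which is what makes the graph useful for generating training examples of each kind.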

Outcomes: Effective simulation of student cognition and misconceptions; improved accuracy in adaptive testing and personalized instruction.

Challenges: Balancing the representation of misconceptions with correct problem-solving; training that emphasizes misconception examples tends to degrade the model's correct problem-solving ability.

Implementation Barriers

Technical Barrier

Difficulty in accurately modeling student misconceptions while maintaining correct solution abilities.

Proposed Solutions: Calibrate the ratio of correct to misconception examples in the training data, balancing the two so the CSM both reproduces the target misconceptions and retains correct problem-solving ability.
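The ratio-calibration idea can be sketched in a few lines. The function name and the specific ratio below are illustrative assumptions, not the paper's reported pipeline or values; the point is only that the mix of correct and misconception examples is an explicit, tunable parameter when assembling the instruction-tuning set.

```python
# Illustrative sketch of calibrating training-data ratios (hypothetical
# helper, not from the paper): sample enough correct examples so that a
# chosen fraction of the final mix demonstrates correct problem solving.
import random

def build_training_mix(correct_examples, misconception_examples,
                       correct_ratio, seed=0):
    """Return a shuffled training set in which `correct_ratio` of the
    items are correct solutions and the rest are misconception traces."""
    rng = random.Random(seed)
    n_miscon = len(misconception_examples)
    # Solve n_correct / (n_correct + n_miscon) = correct_ratio for n_correct.
    n_correct = min(len(correct_examples),
                    round(n_miscon * correct_ratio / (1 - correct_ratio)))
    mix = rng.sample(correct_examples, n_correct) + list(misconception_examples)
    rng.shuffle(mix)
    return mix

correct = [f"correct-{i}" for i in range(100)]
miscon = [f"miscon-{i}" for i in range(20)]
mix = build_training_mix(correct, miscon, correct_ratio=0.75)
print(len(mix))  # 60 correct + 20 misconception examples
```

Sweeping `correct_ratio` and evaluating the tuned model on both misconception fidelity and solution accuracy is one straightforward way to find the balance the section describes.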

Implementation Barrier

Limited generalizability of the models across different areas of mathematics and complexity levels.

Proposed Solutions: Expand the MalAlgoPy library to include more complex algebraic structures and broader mathematical concepts.

Project Team

Shashank Sonkar

Researcher

Xinghe Chen

Researcher

Naiming Liu

Researcher

Richard G. Baraniuk

Researcher

Mrinmaya Sachan

Researcher

Contact Information

For information about the paper, please contact the authors.

Authors: Shashank Sonkar, Xinghe Chen, Naiming Liu, Richard G. Baraniuk, Mrinmaya Sachan

Source Publication: View Original Paper

Project Contact: Dr. Jianhua Yang

LLM Model Version: gpt-4o-mini-2024-07-18

Analysis Provider: OpenAI
