
AI Meets the Classroom: When Do Large Language Models Harm Learning?

Project Overview

The paper examines the role of generative AI, particularly large language models (LLMs), in education and finds a mixed impact on learning outcomes. LLMs can expand the range of topics students engage with, but they may also hinder comprehension, especially for students with limited prior knowledge. Students often use LLMs to replace traditional learning activities rather than to enhance them, which reduces overall educational effectiveness.

Empirical findings indicate that how students interact with LLMs strongly influences performance, with gender, study level, and prior LLM experience all playing a role. The paper also highlights challenges in integrating LLMs into educational contexts, including engagement problems and the potential for misuse, and stresses that usage behavior and contextual factors must be considered to realize the benefits of LLMs. Overall, LLMs hold promise for enhancing educational experiences, but their implementation requires deliberate strategies so that they serve as effective learning tools rather than substitutes for genuine understanding.

Key Applications

LLM-based Learning Support and Tutoring

Context: University-level education, specifically graduate programming courses and other disciplines, where students have access to LLMs for help with problem-solving and understanding concepts while they learn.

Implementation: Students interacted with large language models (LLMs) during coding exercises and other learning tasks. Access was provided under varying conditions, including whether students could copy and paste code. The research combined laboratory experiments and field studies assessing the impact of LLMs on learning outcomes, performance on coding exercises, and overall educational engagement.

Outcomes: LLM access increased the number of topics covered, and some students improved their performance and their ability to apply concepts. However, there were also signs of reduced topic understanding and shallow learning, particularly when students relied on LLMs for solutions instead of attempting problems independently.

Challenges: Effectiveness varied with factors such as gender and prior experience with LLMs, and there were problems with engagement and misuse, particularly a tendency to copy and paste answers instead of actively learning from the interaction.

Implementation Barriers

Technical Barrier

The inability to copy and paste when using LLMs increases transaction costs and reduces the models' effectiveness as learning aids.

Proposed Solutions: Enabling copy and paste functionality to facilitate smoother interactions with LLMs.

Behavioral/Misuse Barrier

Students exhibit a tendency to substitute LLM use for their own learning activities, which can harm their understanding. Additionally, they might misuse LLMs by copying and pasting answers rather than engaging with the material.

Proposed Solutions: Encouraging students to use LLMs as complementary tools rather than substitutes for direct engagement with learning materials, and implementing guidelines and monitoring use to promote constructive interaction with LLMs.

Engagement Barrier

Students may not engage with LLMs effectively, leading to suboptimal learning outcomes.

Proposed Solutions: Encouraging proper usage and integrating LLMs into structured learning activities.

Project Team

Matthias Lehmann

Researcher

Philipp B. Cornelius

Researcher

Fabian J. Sting

Researcher

Contact Information

For information about the paper, please contact the authors.

Authors: Matthias Lehmann, Philipp B. Cornelius, Fabian J. Sting

Source Publication: View Original Paper

Project Contact: Dr. Jianhua Yang

LLM Model Version: gpt-4o-mini-2024-07-18

Analysis Provider: OpenAI
