
Explainability via Interactivity? Supporting Nonexperts' Sensemaking of Pretrained CNN by Interacting with Their Daily Surroundings

Project Overview

This project explores the use of explainable AI (XAI) in education through a mobile application that helps non-expert users, especially design students, grasp how Convolutional Neural Networks (CNNs) work. Using XAI techniques such as Class Activation Maps, the application provides an interactive platform for users to visualize and make sense of a CNN's decision-making process by pointing the tool at objects in their daily surroundings. This approach engages students in active learning and builds their understanding of both the capabilities and the limitations of AI. Deployed in a university course, the tool produced positive outcomes, improving students' learning experience and deepening their understanding of artificial intelligence. Through its focus on interactivity and visualization, the application illustrates how XAI techniques can make machine learning more accessible in educational contexts.

Key Applications

Mobile application that uses Class Activation Maps to visualize CNN decisions

Context: University course module 'prototyping with AI' for 30 design students with little to no prior experience in AI

Implementation: The application was integrated into a university course where students were instructed via an online video lecture and encouraged to interact with their daily surroundings using the tool.

Outcomes: Students reported positive experiences, finding the tool fun and informative. They gained a practical understanding of CNNs and engaged in sensemaking activities that enhanced their comprehension of AI models.

Challenges: Limited prior knowledge among students about AI concepts may hinder initial understanding; ensuring the tool is intuitive for non-experts is critical.

Implementation Barriers

Knowledge Barrier

Non-expert users often have a limited understanding of AI technologies, and the internal complexity of CNN models makes them difficult for such users to interpret and use effectively.

Proposed Solutions: Develop interactive and intuitive tools that engage users in hands-on learning experiences to enhance their understanding. Additionally, utilize visualization techniques such as Class Activation Maps to simplify and clarify the decision-making process of CNNs.
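To make the visualization idea concrete, the following is a minimal sketch of the Class Activation Map (CAM) computation that tools like this rely on: the map is the weighted sum of the last convolutional layer's feature maps, using the classifier weights for the class of interest. The function name and the synthetic inputs below are illustrative assumptions, not code from the paper's application, which runs a pretrained CNN on a phone.

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """Compute a Class Activation Map.

    feature_maps:  array of shape (K, H, W) -- activations of the K channels
                   of the final convolutional layer.
    class_weights: array of shape (K,) -- the classifier weights connecting
                   each channel to the logit of the class being explained.
    Returns an (H, W) map normalized to [0, 1], suitable for overlaying
    on the input image as a heatmap.
    """
    # Weighted sum over channels: cam[h, w] = sum_k w[k] * A[k, h, w]
    cam = np.tensordot(class_weights, feature_maps, axes=([0], [0]))
    cam = np.maximum(cam, 0)          # keep only positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()         # normalize for display
    return cam

# Illustrative usage with synthetic activations (4 channels, 7x7 spatial grid)
rng = np.random.default_rng(0)
activations = rng.random((4, 7, 7))
weights = rng.random(4)
heatmap = class_activation_map(activations, weights)
```

In a real deployment the feature maps would be captured from the network's forward pass on a camera frame, and the heatmap upsampled to the frame's resolution before overlay.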

Project Team

Chao Wang

Researcher

Pengcheng An

Researcher

Contact Information

For information about the paper, please contact the authors.

Authors: Chao Wang, Pengcheng An

Source Publication: View Original Paper

Project Contact: Dr. Jianhua Yang

LLM Model Version: gpt-4o-mini-2024-07-18

Analysis Provider: OpenAI
