A Multilingual Sentiment Lexicon for Low-Resource Language Translation using Large Language Models and Explainable AI
Project Overview
The document explores the role of generative AI in education, focusing on its application in multilingual sentiment analysis and translation, particularly for low-resource languages in South Africa and the Democratic Republic of Congo. It underscores the critical need for accurate sentiment classification and translation tools to support underrepresented languages in educational contexts. Key findings reveal the effectiveness of machine learning models, especially BERT, in performing sentiment analysis, which is essential for understanding student feedback and improving learning experiences. Additionally, the document emphasizes the importance of explainable AI (XAI) techniques to enhance model transparency and foster trust among educators and students. Overall, the integration of generative AI in education presents promising solutions to language barriers and enhances accessibility and inclusivity in learning environments.
Key Applications
Multilingual Lexicon for Sentiment Analysis and Translation
Context: Education, government, and business in South Africa and the DRC, targeting multilingual communities.
Implementation: Creation of a lexicon with sentiment scores for multiple languages, integrated with machine learning models for sentiment classification.
Outcomes: Improved sentiment analysis accuracy and translation quality for low-resource languages, better capturing cultural nuances.
Challenges: Limited labeled data for low-resource languages, varying sentiment expressions across cultures.
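The core of the approach described above is a lexicon that maps words in each language to sentiment scores, which a classifier can then consume. A minimal sketch of such a lookup is shown below; all language codes, words, and scores are illustrative placeholders, not entries from the authors' actual lexicon.

```python
# Hypothetical multilingual sentiment lexicon: tokens mapped to scores in [-1, 1].
# Entries are illustrative placeholders, not the authors' data.
LEXICON = {
    "zu": {"kuhle": 0.8, "kubi": -0.7},    # Zulu
    "nso": {"botse": 0.7, "mpe": -0.6},    # Sepedi
    "lua": {"bimpe": 0.8, "bibi": -0.7},   # Ciluba
}

def sentiment_score(text: str, lang: str) -> float:
    """Average the lexicon scores of known tokens; 0.0 if none match."""
    tokens = text.lower().split()
    scores = [LEXICON.get(lang, {}).get(t) for t in tokens]
    scores = [s for s in scores if s is not None]
    return sum(scores) / len(scores) if scores else 0.0
```

In practice such lexicon scores would be combined with a trained model (e.g. as features or weak labels) rather than used alone; this sketch only illustrates the lookup step.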
Implementation Barriers
Data scarcity and model limitations
Insufficient labeled data for low-resource languages hinders the development of accurate AI systems, and existing machine learning models struggle to capture the linguistic complexities of these languages.
Proposed Solutions: Development of lexicons and datasets specifically for languages like Zulu, Sepedi, and Ciluba to support AI applications, along with utilizing advanced models like BERT and integrating explainable AI techniques to enhance model understanding and performance.
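One lightweight form of explainability compatible with the lexicon approach above is per-token attribution: reporting each word's contribution to the overall sentiment so educators can inspect why a text was classified a certain way. A minimal sketch follows (the lexicon fragment and its scores are hypothetical; attribution for a full BERT model would instead use a feature-attribution method such as LIME or SHAP):

```python
# Illustrative lexicon fragment (placeholder entries, not the authors' data).
LEXICON = {"zu": {"kuhle": 0.8, "kubi": -0.7}}

def explain(text: str, lang: str, lexicon: dict) -> list[tuple[str, float]]:
    """Per-token sentiment attribution: pair each word with its lexicon
    score (0.0 if unknown), making the classification inspectable."""
    return [(tok, lexicon.get(lang, {}).get(tok, 0.0))
            for tok in text.lower().split()]
```

Because each score comes directly from the lexicon, the explanation is exact rather than approximated, which is one reason lexicon features can help build trust alongside black-box models.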
Project Team
Melusi Malinga
Researcher
Isaac Lupanda
Researcher
Mike Wa Nkongolo
Researcher
Phil van Deventer
Researcher
Contact Information
For information about the paper, please contact the authors.
Authors: Melusi Malinga, Isaac Lupanda, Mike Wa Nkongolo, Phil van Deventer
Source Publication: View Original Paper
Project Contact: Dr. Jianhua Yang
LLM Model Version: gpt-4o-mini-2024-07-18
Analysis Provider: OpenAI