This talk presents a retrieval-augmented generation system tailored for emotion analysis in Nigerian Afrobeats lyrics. By integrating supplementary dictionaries with a hierarchical classifier, the system can retrieve culturally relevant context to refine emotion predictions. We discuss the architecture, implementation challenges, and the potential for addressing code-mixing and linguistic nuances in low-resource settings.
AI, LLMs, Music
In this talk, we introduce a novel retrieval-augmented generation (RAG) system designed to enhance emotion analysis in Nigerian Afrobeats lyrics. Our approach combines supplementary linguistic resources, including Igbo-English, Yoruba-English, Pidgin-English, and English-Hausa dictionaries, with a hierarchical classifier. The system first uses a root-level classifier to assign a meta-emotion and then retrieves contextually relevant passages from a combined corpus of lyrics and dictionary entries. This retrieved context is appended to the original input, allowing the branch-level classifier to make more informed fine-grained predictions.
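To make the retrieve-then-classify flow concrete, here is a minimal sketch of the pipeline described above. Everything in it is a hypothetical placeholder rather than the authors' implementation: the toy corpus, the classify_root and classify_branch stubs, and the choice of TF-IDF retrieval (via scikit-learn) stand in for whatever models and retrieval index the actual system uses.

```python
# Sketch of the hierarchical retrieve-and-augment pipeline (assumed details,
# not the authors' code): a root classifier picks a meta-emotion, a retriever
# pulls culturally relevant passages from lyrics + dictionary glosses, and the
# retrieved context is appended before branch-level classification.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Combined retrieval corpus: code-mixed lyric lines and bilingual dictionary entries (toy examples).
corpus = [
    "ife m, my love, joy dey my heart",      # code-mixed lyric line
    "ife (Igbo): love, affection",           # Igbo-English dictionary gloss
    "wahala (Pidgin): trouble, distress",    # Pidgin-English dictionary gloss
]

vectorizer = TfidfVectorizer()
corpus_matrix = vectorizer.fit_transform(corpus)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the query."""
    scores = cosine_similarity(vectorizer.transform([query]), corpus_matrix)[0]
    top_indices = scores.argsort()[::-1][:k]
    return [corpus[i] for i in top_indices]

def classify_root(lyric: str) -> str:
    """Placeholder root-level classifier: assigns a coarse meta-emotion."""
    return "positive"  # stand-in for a trained model's prediction

def classify_branch(meta_emotion: str, augmented_input: str) -> str:
    """Placeholder branch-level classifier selected by the meta-emotion."""
    return {"positive": "joy", "negative": "sadness"}.get(meta_emotion, "neutral")

def predict_emotion(lyric: str) -> str:
    meta = classify_root(lyric)                    # step 1: coarse meta-emotion
    context = " | ".join(retrieve(lyric))          # step 2: retrieve cultural context
    augmented = f"{lyric} [CONTEXT] {context}"     # step 3: append context to the input
    return classify_branch(meta, augmented)        # step 4: fine-grained prediction

print(predict_emotion("ife m dey sweet me, no wahala today"))
```

The key design point the sketch illustrates is that retrieval happens between the two classification stages, so the branch-level model sees both the original lyric and the dictionary-informed context.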
We will cover the system’s architecture, the process of building an enhanced retrieval corpus, and the integration of retrieval with hierarchical classification. The talk also discusses challenges we encountered, such as data imbalance, error propagation from the root classifier, and ambiguity in dynamic emotion expressions, and outlines directions for future work, including ensemble methods and explainability tools to further improve robustness and transparency. This work shows that using culturally informed resources can significantly improve emotion recognition in code-mixed, low-resource settings.
Oyinkansola Onwuchekwa (KKC) is a PhD researcher in Artificial Intelligence and Data Science at the University of Hull, specialising in the intersection of music and emotion through deep learning. Her thesis, Cross-Cultural Analysis of Music Genres and Emotions Using Deep Learning, investigates how emotions are conveyed in Afrobeats and Western music lyrics, with a particular focus on low-resource languages such as Yoruba, Igbo, Hausa, and Nigerian Pidgin. As a musician and writer performing under the name KKC, she brings a creative perspective to her technical research.
Alongside her doctoral work, Oyinkansola is a Research Assistant on a British Academy-funded project under the ODA Challenge-Oriented Research Grants Programme 2024, which empowers rural artist communities in Burkina Faso through digital and AI literacy. Her contributions include designing innovative toolkits for understanding AI technologies, training generative AI models on cultural data, and supporting ethical data governance. An active mentor, she has supervised a Nuffield Research Placement project and founded a tech help group that supports over 200 aspiring data scientists worldwide. Recognised with awards such as the AdvanceHE Associate Fellowship and a DAIM Certificate of Recognition, Oyinkansola is committed to advancing inclusive AI that addresses the challenges of processing African languages and minimising bias in global language technologies.