Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning

Raichur NL, Heublein L, Feigl T, Ruegamer A, Mutschler C, Ott F (2026)


Publication Status: Submitted

Publication Type: Unpublished / Preprint

Future Publication Type: Journal article

Publication year: 2026

Publisher: arXiv

DOI: 10.48550/arXiv.2405.11067

Abstract

The primary objective of continual learning methods is to learn tasks sequentially over time (sometimes from a stream of data) while mitigating the detrimental phenomenon of catastrophic forgetting. This paper proposes a method to learn an effective representation between previously seen and newly encountered class prototypes. We propose a prototypical network with a Bayesian learning-driven contrastive loss (BLCL), tailored specifically for class-incremental learning scenarios. We introduce a contrastive loss that incorporates novel classes into the latent representation by reducing intra-class distances and increasing inter-class distances. Our approach dynamically adapts the balance between the cross-entropy and contrastive loss functions with a Bayesian learning technique. Experiments conducted on the CIFAR-10, CIFAR-100, and ImageNet100 datasets for image classification, and on images of a GNSS-based dataset for interference classification, validate the efficacy of our method, showcasing its superiority over existing state-of-the-art approaches.
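The abstract describes two components: a prototypical contrastive loss that pulls samples toward their own class prototype and pushes them from others, and a dynamically weighted combination of this loss with cross-entropy. The paper's exact Bayesian formulation is in the preprint; as a rough illustration under assumed details (squared-Euclidean distances to prototypes, and log-variance uncertainty weighting standing in for the paper's Bayesian balancing scheme), the pieces might be sketched as:

```python
import numpy as np

def class_prototypes(embeddings, labels):
    """Mean embedding per class (row index = class id)."""
    classes = np.unique(labels)
    return np.stack([embeddings[labels == c].mean(axis=0) for c in classes])

def prototypical_contrastive_loss(embeddings, labels, prototypes):
    """Reduce intra-class distance (to the own prototype) and increase
    inter-class distance (to other prototypes) via a softmax over
    negative squared Euclidean distances."""
    d = ((embeddings[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    logits = -d
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def combined_loss(ce_loss, con_loss, log_var_ce, log_var_con):
    """Illustrative dynamic weighting of the two losses: each term is
    scaled by a learned precision exp(-log_var) and regularised by its
    log-variance (a stand-in for the paper's Bayesian balancing)."""
    return (np.exp(-log_var_ce) * ce_loss + log_var_ce
            + np.exp(-log_var_con) * con_loss + log_var_con)
```

With two well-separated clusters the contrastive term is already small; as training moves samples toward their prototypes, it shrinks further while the learned log-variances rebalance the two objectives.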


How to cite

APA:

Raichur, N.L., Heublein, L., Feigl, T., Ruegamer, A., Mutschler, C., & Ott, F. (2026). Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning. (Unpublished, Submitted).

MLA:

Raichur, Nisha Lakshmana, et al. Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning. Unpublished, Submitted. 2026.
