Toward Label-Efficient Neural Network Training: Diversity-Based Sampling in Semi-Supervised Active Learning

Buchert F, Navab N, Kim ST (2023)


Publication Type: Journal article

Publication year: 2023

Journal: IEEE Access

Volume: 11

Pages: 5193-5205

DOI: 10.1109/ACCESS.2023.3236529

Abstract

Collecting large labeled datasets for training deep neural networks is expensive and challenging. To address this issue, active learning has recently been studied, in which an active learner selects informative samples for labeling. Diversity-based sampling algorithms are commonly used for representation-based active learning. In this paper, a new diversity-based sampling method is introduced for semi-supervised active learning. To select more informative data at the initial stage, we devise a diversity-based initial dataset selection method that uses self-supervised representations. We further propose a new active learning query strategy that exploits both consistency and diversity. Comparative experiments show that the proposed method outperforms other active learning approaches on two public datasets.
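A common instantiation of diversity-based sampling over a learned representation is greedy k-center (core-set) selection: repeatedly pick the unlabeled point farthest from the current selection. The sketch below illustrates that general idea on toy embeddings; it is an assumption for illustration only, not the paper's exact initial-selection or query procedure, and the embeddings stand in for self-supervised features.

```python
import numpy as np

def k_center_greedy(embeddings, k, seed_idx=0):
    """Greedy k-center (core-set) selection over feature embeddings.

    Starting from one seed point, repeatedly select the point with the
    largest distance to its nearest already-selected center. This is a
    generic diversity-based sampler; the paper's actual method may differ.
    """
    # Distance of every point to its nearest selected center so far.
    dists = np.linalg.norm(embeddings - embeddings[seed_idx], axis=1)
    selected = [seed_idx]
    while len(selected) < k:
        idx = int(np.argmax(dists))          # farthest point = most diverse
        selected.append(idx)
        # Update nearest-center distances with the newly selected point.
        dists = np.minimum(
            dists, np.linalg.norm(embeddings - embeddings[idx], axis=1)
        )
    return selected

# Toy usage: pick a diverse initial subset of 10 from 200 random embeddings.
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 32))
initial_set = k_center_greedy(feats, k=10)
```

Selecting by maximum distance to the nearest center spreads the labeled budget across the representation space, which is the intuition behind using diversity for the initial labeled pool.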


How to cite

APA:

Buchert, F., Navab, N., & Kim, S.T. (2023). Toward Label-Efficient Neural Network Training: Diversity-Based Sampling in Semi-Supervised Active Learning. IEEE Access, 11, 5193-5205. https://doi.org/10.1109/ACCESS.2023.3236529

MLA:

Buchert, Felix, Nassir Navab, and Seong Tae Kim. "Toward Label-Efficient Neural Network Training: Diversity-Based Sampling in Semi-Supervised Active Learning." IEEE Access 11 (2023): 5193-5205.
