CEA - Commissariat à l'énergie atomique et aux énergies alternatives
Conference paper - Year: 2023

Generalized pseudo-labeling in consistency regularization for semi-supervised learning

Abstract

Semi-Supervised Learning (SSL) reduces annotation cost by exploiting large amounts of unlabeled data. A popular idea in SSL image classification is Pseudo-Labeling (PL), where the predictions of a network are used to assign a label to an unlabeled image. However, this practice exposes learning to confirmation bias. In this paper, we propose Generalized Pseudo-Labeling (GPL), a simple and generic way to exploit negative pseudo-labels in consistency regularization, entailing minimal additional computational overhead and hyperparameter fine-tuning. GPL makes learning more robust by using the information that an image does not belong to a certain class, which is more abundant and reliable. We showcase GPL in the context of FixMatch. In the benchmark using only 40 labels of the CIFAR-10 dataset, adding GPL on top of FixMatch improves the error rate from 7.93% to 6.58%, and on CIFAR-100 with 2500 labels, from 28.02% to 26.85%.
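
The abstract only sketches the idea; the exact loss is defined in the paper. As a rough, hypothetical illustration of how negative pseudo-labels could be combined with FixMatch-style positive pseudo-labels, the PyTorch-like sketch below keeps the usual confident positive pseudo-label on the weakly augmented view, and additionally penalizes, on the strongly augmented view, the probability of classes the weak view deems very unlikely. The thresholds tau_pos and tau_neg and the -log(1 - p) negative-label term are assumptions made for illustration, not the paper's formulation.

import torch
import torch.nn.functional as F

# Hypothetical sketch: FixMatch-style positive pseudo-labels plus a
# negative pseudo-label term. Thresholds and the negative loss form are
# illustrative assumptions, not the paper's exact definition of GPL.
def unlabeled_loss(model, x_weak, x_strong, tau_pos=0.95, tau_neg=0.01):
    with torch.no_grad():
        probs = F.softmax(model(x_weak), dim=1)   # predictions on the weak view
        conf, pseudo = probs.max(dim=1)           # candidate positive pseudo-labels
        pos_mask = conf.ge(tau_pos).float()       # keep only confident positives
        neg_mask = probs.le(tau_neg).float()      # classes the image likely is NOT

    logits_s = model(x_strong)                    # predictions on the strong view
    probs_s = F.softmax(logits_s, dim=1)

    # FixMatch term: cross-entropy against confident positive pseudo-labels.
    loss_pos = (F.cross_entropy(logits_s, pseudo, reduction="none") * pos_mask).mean()

    # Negative pseudo-label term: push down the probability of classes that the
    # weak view ruled out, i.e. -log(1 - p_c) for each negative class c.
    loss_neg = -(neg_mask * torch.log(1.0 - probs_s + 1e-7)).sum(dim=1).mean()

    return loss_pos + loss_neg

The intuition the abstract points to is that many classes can be ruled out for each unlabeled image, so negative pseudo-labels are both more abundant and more reliable than the single positive pseudo-label.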
Main file: GPL_ICIP_NoteIEEE.pdf (1.03 MB)
Origin: Files produced by the author(s)

Dates and versions

cea-04503203, version 1 (13-03-2024)


Cite

Nikolaos Karaliolios, Florian Chabot, Camille Dupont, Hervé Le Borgne, Quoc-Cuong Pham, et al. Generalized pseudo-labeling in consistency regularization for semi-supervised learning. ICIP 2023 - 2023 IEEE International Conference on Image Processing, Oct 2023, Kuala Lumpur, Malaysia. pp. 525-529, ⟨10.1109/ICIP49359.2023.10221965⟩. ⟨cea-04503203⟩