Conference paper · Year: 2022

Adapting without forgetting: KnowBert-UMLS

Abstract

Domain adaptation in pretrained language models usually comes at a cost, most notably to out-of-domain performance. This type of specialization typically relies on pretraining over a large in-domain corpus, which has the side effect of causing catastrophic forgetting on general text. We seek to specialize a language model by incorporating information from a knowledge base into its contextualized representations, thus reducing its reliance on specialized text. We achieve this by following the KnowBert method, applied to the UMLS biomedical knowledge base. We evaluate our model on in-domain and out-of-domain tasks, comparing it against BERT and other specialized models. We find that its performance on biomedical tasks is competitive with the state of the art with virtually no loss of generality. Our results demonstrate the applicability of this knowledge-integration technique to the biomedical domain as well as its shortcomings. The reduced risk of catastrophic forgetting displayed by this approach to domain adaptation broadens the scope of applicability of specialized language models.
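For readers unfamiliar with the approach, the sketch below illustrates the kind of knowledge attention and recontextualization (KAR) layer that the KnowBert method (Peters et al., 2019) inserts between transformer layers: mention spans are linked to candidate knowledge-base entries, candidate embeddings are softly combined, and tokens are recontextualized by attending over the knowledge-enhanced spans. The class name, tensor shapes, and the candidate/prior interface are assumptions made for illustration; this is not the authors' implementation, and a UMLS variant would supply CUI candidates and embeddings.

```python
# Minimal, illustrative sketch of a KnowBert-style knowledge attention and
# recontextualization (KAR) layer. Names and dimensions are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeAttention(nn.Module):
    def __init__(self, d_model=768, d_entity=200):
        super().__init__()
        self.proj_down = nn.Linear(d_model, d_entity)  # contextual -> entity space
        self.proj_up = nn.Linear(d_entity, d_model)    # entity space -> contextual
        self.recontextualize = nn.MultiheadAttention(d_model, num_heads=4,
                                                     batch_first=True)

    def forward(self, hidden, span_mask, candidate_embs, candidate_prior):
        # hidden:          (B, T, d_model)     contextual token representations
        # span_mask:       (B, M, T)           1.0 where token t is in mention m
        # candidate_embs:  (B, M, C, d_entity) KB embeddings of candidate entities
        #                  (for UMLS, e.g., pretrained CUI embeddings)
        # candidate_prior: (B, M, C)           prior scores from candidate generation
        h = self.proj_down(hidden)                                   # (B, T, d_entity)
        lengths = span_mask.sum(-1, keepdim=True).clamp(min=1)
        spans = (span_mask @ h) / lengths                            # (B, M, d_entity) mean-pooled mentions
        # Score each candidate by similarity to its mention span plus the linker prior
        scores = (candidate_embs * spans.unsqueeze(2)).sum(-1) + candidate_prior
        weights = F.softmax(scores, dim=-1)                          # (B, M, C)
        entity = (weights.unsqueeze(-1) * candidate_embs).sum(2)     # soft entity embedding
        enhanced = self.proj_up(spans + entity)                      # (B, M, d_model)
        # Recontextualize: every token attends over knowledge-enhanced mention spans
        update, _ = self.recontextualize(hidden, enhanced, enhanced)
        return hidden + update  # residual connection preserves general-domain behaviour
```

In KnowBert-style training, the candidate scores are additionally supervised with an entity-linking objective, and the residual connection around the inserted layer is what limits drift from the original BERT representations, which is consistent with the reduced catastrophic forgetting reported in the abstract.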
Main file: Adapting_without_forgetting_Guilhem_PIAT_eHPWAS-2022.pdf (371.69 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

cea-04513766, version 1 (20-03-2024)

Identifiers

HAL Id: cea-04513766
DOI: 10.1109/WiMob55322.2022

Cite

Guilhem Piat, Alexandre Allauzen, Hassane Essafi, Gaël Bernard, Julien Tourille, et al. Adapting without forgetting: KnowBert-UMLS. WiMob 2022 - 18th International Conference on Wireless and Mobile Computing, Networking and Communications, Oct 2022, Thessaloniki, Greece. pp. 19-24, ⟨10.1109/WiMob55322.2022⟩. ⟨cea-04513766⟩