Knowledge Graph Refinement based on Triplet BERT-Networks - Laboratoire Interdisciplinaire des Sciences du Numérique
Other Scientific Publication, Year: 2022

Knowledge Graph Refinement based on Triplet BERT-Networks

Abstract

Knowledge graph embedding techniques are widely used for knowledge graph refinement tasks such as graph completion and triple classification. These techniques aim to embed the entities and relations of a Knowledge Graph (KG) in a low-dimensional continuous feature space. Unlike KG embedding methods, and inspired by works built upon pre-trained language models, this paper adopts a transformer-based triplet network. It creates textual sequences from facts and fine-tunes a triplet network of pre-trained transformer-based language models, creating an embedding space that clusters the information about an entity or relation in the KG. It adheres to an evaluation paradigm that relies on an efficient spatial semantic search technique. We show that this evaluation protocol is better adapted to a few-shot setting for the relation prediction task. Our proposed GilBERT method is evaluated on triplet classification and relation prediction tasks on multiple well-known benchmark knowledge graphs such as FB13, WN11, and FB15K. We show that GilBERT achieves better than or comparable results to the state-of-the-art performance on these two refinement tasks.
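The abstract describes three ideas that can be illustrated compactly: verbalizing a KG fact (h, r, t) into a textual sequence, training with a triplet objective that pulls true facts together and pushes false ones apart, and casting relation prediction as nearest-neighbor semantic search in the resulting embedding space. The sketch below is illustrative only: the verbalization template, toy 3-d vectors, and function names are assumptions, standing in for the paper's actual fine-tuned BERT embeddings.

```python
import math

def verbalize(head, relation, tail):
    # Turn a KG fact (h, r, t) into a textual sequence for a language model.
    # Hypothetical template; the paper's exact verbalization may differ.
    return f"{head} {relation.replace('_', ' ')} {tail}"

def euclidean(u, v):
    # Distance in the embedding space used for spatial semantic search.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    # Standard triplet objective: keep the positive closer to the anchor
    # than the negative by at least `margin`.
    return max(0.0, euclidean(anchor, positive)
                    - euclidean(anchor, negative) + margin)

def rank_candidates(query, candidates):
    # Relation prediction as semantic search: rank (name, embedding)
    # candidates by distance to the query embedding, nearest first.
    return sorted(candidates, key=lambda c: euclidean(query, c[1]))

# Toy 3-d "embeddings" standing in for fine-tuned transformer outputs.
anchor   = [1.0, 0.0, 0.0]   # embedding of the query sequence
positive = [0.9, 0.1, 0.0]   # embedding of a true fact
negative = [0.0, 1.0, 1.0]   # embedding of a false fact

print(verbalize("Paris", "capital_of", "France"))        # Paris capital of France
print(triplet_margin_loss(anchor, positive, negative))   # 0.0 (already separated)

relations = [("capital_of", [0.9, 0.1, 0.0]), ("born_in", [0.0, 1.0, 1.0])]
print(rank_candidates(anchor, relations)[0][0])          # capital_of
```

In practice the toy vectors would come from the three weight-sharing encoders of the triplet network, and the candidate set would contain one embedding per relation in the KG, so a query fact is classified by its nearest relation.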
Main file

DeepOntoNLP_ESWC_Knowledge GraphRefinement basedonTripletBERT-Networks.pdf (231.6 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03689800 , version 1 (07-06-2022)

Identifiers

  • HAL Id : hal-03689800 , version 1

Cite

Armita Nassiri, Nathalie Pernelle, Fatiha Saïs, Gianluca Quercini. Knowledge Graph Refinement based on Triplet BERT-Networks. 2022. ⟨hal-03689800⟩
70 Views
47 Downloads
