Knowledge Graph Refinement based on Triplet BERT-Networks
Other Publications, Year: 2022



Knowledge graph embedding techniques are widely used for knowledge graph refinement tasks such as graph completion and triple classification. These techniques aim at embedding the entities and relations of a Knowledge Graph (KG) in a low-dimensional continuous feature space. Unlike KG embedding methods, and inspired by work built upon pre-trained language models, this paper adopts a transformer-based triplet network. It creates textual sequences from facts and fine-tunes a triplet network of pre-trained transformer-based language models, producing an embedding space that clusters the information about an entity or relation in the KG. It adheres to an evaluation paradigm that relies on an efficient spatial semantic search technique. We show that this evaluation protocol is better suited to a few-shot setting for the relation prediction task. Our proposed GilBERT method is evaluated on triplet classification and relation prediction tasks on multiple well-known benchmark knowledge graphs, such as FB13, WN11, and FB15K. We show that GilBERT achieves results that are better than or comparable to the state of the art on these two refinement tasks.
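The evaluation paradigm described above, ranking candidates by spatial semantic search in the learned embedding space, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the relation names and the random vectors are stand-ins for embeddings that GilBERT would obtain from its fine-tuned transformer triplet network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical relation embeddings; in GilBERT these would be produced by
# the fine-tuned transformer from textual sequences of KG facts.
relation_embeddings = {
    "/people/person/nationality": rng.normal(size=8),
    "/film/film/genre": rng.normal(size=8),
    "/location/location/contains": rng.normal(size=8),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def predict_relation(query_vec, candidates):
    """Spatial semantic search: return the candidate relation whose
    embedding is most similar to the query embedding."""
    return max(candidates, key=lambda r: cosine(query_vec, candidates[r]))

# A query embedding lying close to one known relation in the space.
query = relation_embeddings["/film/film/genre"] + 0.01 * rng.normal(size=8)
print(predict_relation(query, relation_embeddings))
```

Because prediction reduces to a nearest-neighbor lookup over all candidate relations, it needs no per-candidate scoring pass through the language model, which is what makes the protocol attractive in a few-shot setting.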

Dates and versions

HAL Id: hal-03689800, version 1 (07-06-2022)

Armita Nassiri, Nathalie Pernelle, Fatiha Saïs, Gianluca Quercini. Knowledge Graph Refinement based on Triplet BERT-Networks. 2022. ⟨hal-03689800⟩

