Embedding Learning with Triple Trustiness on Noisy Knowledge Graph - Sorbonne Université
Journal article — Entropy, 2019

Abstract

Embedding learning on knowledge graphs (KGs) aims to encode all entities and relationships into a continuous vector space, which provides an effective and flexible way to implement downstream knowledge-driven artificial intelligence (AI) and natural language processing (NLP) tasks. Since KG construction usually involves automatic mechanisms with little human supervision, it inevitably introduces considerable noise into KGs. However, most conventional KG embedding approaches inappropriately assume that all facts in existing KGs are completely correct and ignore noise, which can lead to serious errors. To address this issue, in this paper we propose a novel approach to learning embeddings with triple trustiness on KGs, which takes possible noise into consideration. Specifically, we calculate the trustiness value of each triple from the rich and relatively reliable information in large numbers of entity type instances and entity descriptions in KGs. In addition, we present a cross-entropy-based loss function for model optimization. In experiments, we evaluate our models on KG noise detection, KG completion, and triple classification. Through extensive experiments on three datasets, we demonstrate that our proposed model learns better embeddings than all baselines on noisy KGs.
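The core idea of the abstract — down-weighting likely-noisy triples via a trustiness value inside a cross-entropy loss — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's implementation: it assumes a TransE-style plausibility score, and the function names (`transe_score`, `trust_weighted_cross_entropy`) and the per-triple `trustiness` array are placeholders (the paper derives trustiness from entity types and descriptions).

```python
import numpy as np

def transe_score(h, r, t):
    # TransE-style plausibility: negative L1 distance ||h + r - t||_1.
    # Higher scores mean the triple (h, r, t) is more plausible.
    return -np.sum(np.abs(h + r - t), axis=-1)

def trust_weighted_cross_entropy(scores, labels, trustiness):
    # Per-triple binary cross-entropy, weighted by each triple's
    # trustiness value so that low-trust (likely noisy) triples
    # contribute less to the total loss and its gradient.
    probs = 1.0 / (1.0 + np.exp(-scores))  # sigmoid of plausibility
    eps = 1e-12                            # numerical safety for log
    ce = -(labels * np.log(probs + eps)
           + (1.0 - labels) * np.log(1.0 - probs + eps))
    return np.mean(trustiness * ce)

# Usage sketch with random embeddings for a small batch of triples.
rng = np.random.default_rng(0)
h, r, t = (rng.normal(size=(4, 8)) for _ in range(3))
scores = transe_score(h, r, t)
labels = np.array([1.0, 1.0, 0.0, 0.0])       # positive / negative triples
trust = np.array([1.0, 0.2, 1.0, 1.0])        # second triple deemed noisy
loss = trust_weighted_cross_entropy(scores, labels, trust)
```

Setting `trust` to all ones recovers the standard (unweighted) cross-entropy, so the weighting only changes how much each triple is allowed to influence training.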
Main file: entropy-21-01083-v2.pdf (985.35 KB)
Origin: Publication funded by an institution

Dates and versions

hal-02430754, version 1 (07-01-2020)

Identifiers

Cite

Yu Zhao, Huali Feng, Patrick Gallinari. Embedding Learning with Triple Trustiness on Noisy Knowledge Graph. Entropy, 2019, 21 (11), pp.1083. ⟨10.3390/e21111083⟩. ⟨hal-02430754⟩
62 Views
67 Downloads
