Journal article in Pattern Recognition Letters, 2021

Topic-aware latent models for representation learning on networks

Abstract

Network representation learning (NRL) methods have received significant attention in recent years thanks to their success in several graph analysis problems, including node classification, link prediction, and clustering. Such methods aim to map each vertex of the network into a low-dimensional space in a way that preserves the structural information of the network. Of particular interest are methods based on random walks; these transform the network into a collection of node sequences and learn node representations by predicting the context of each node within its sequence. In this paper, we introduce TNE, a generic framework that enhances the node embeddings obtained by random walk-based approaches with topic-based information. Similar to the concept of topical word embeddings in Natural Language Processing, the proposed model first assigns each node to a latent community, leveraging various statistical graph models and community detection methods, and then learns the enhanced topic-aware representations. We evaluate our methodology on two downstream tasks: node classification and link prediction. The experimental results demonstrate that, by incorporating node and community embeddings, we are able to outperform widely known baseline NRL models.
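To make the general pipeline described in the abstract concrete, the sketch below illustrates the overall idea of topic-aware node embeddings: detect latent communities, generate random walks, learn node and community embeddings jointly with a skip-gram model, and concatenate them. This is a minimal illustration, not the authors' TNE implementation; it assumes networkx and gensim, and uses greedy modularity maximization as a stand-in for the statistical graph models and community detection methods used in the paper.

```python
# Minimal sketch of topic-aware node embeddings (not the authors' TNE code).
import random
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from gensim.models import Word2Vec

def random_walks(G, num_walks=10, walk_length=40):
    """Uniform random walks starting from every node."""
    walks = []
    nodes = list(G.nodes())
    for _ in range(num_walks):
        random.shuffle(nodes)
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                nbrs = list(G.neighbors(walk[-1]))
                if not nbrs:
                    break
                walk.append(random.choice(nbrs))
            walks.append(walk)
    return walks

G = nx.karate_club_graph()

# Assign each node to a latent community (greedy modularity here is only a
# placeholder for the community models considered in the paper).
communities = greedy_modularity_communities(G)
node2comm = {v: k for k, comm in enumerate(communities) for v in comm}

# Build sentences that interleave node tokens and community tokens so the
# skip-gram model learns embeddings for both vocabularies jointly.
walks = random_walks(G)
sentences = [
    [tok for v in walk for tok in (f"n{v}", f"c{node2comm[v]}")]
    for walk in walks
]
model = Word2Vec(sentences, vector_size=64, window=10, min_count=0, sg=1)

# Topic-aware representation: concatenate a node's vector with the vector of
# its community.
def topic_aware_embedding(v):
    return np.concatenate([model.wv[f"n{v}"], model.wv[f"c{node2comm[v]}"]])

print(topic_aware_embedding(0).shape)  # (128,)
```

The concatenation step is what distinguishes the topic-aware representation from a plain random walk embedding: nodes belonging to the same community share part of their final vector, which is the intuition behind the enhanced representations evaluated in the paper.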
Main file: TNE_Pattern_Recognition_Letters.pdf (566.13 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03420690 , version 1 (09-11-2021)
hal-03420690 , version 2 (10-11-2021)

Identifiers

  • HAL Id: hal-03420690, version 1

Cite

Abdulkadir Çelikkanat, Fragkiskos D. Malliaros. Topic-aware latent models for representation learning on networks. Pattern Recognition Letters, 2021. ⟨hal-03420690v1⟩
