Higher-order Sparse Convolutions in Graph Neural Networks - Archive ouverte HAL
Conference paper, Year: 2023

Higher-order Sparse Convolutions in Graph Neural Networks

Abstract

Graph Neural Networks (GNNs) have been applied to many problems in computer science. Capturing higher-order relationships between nodes is crucial to increase the expressive power of GNNs. However, existing methods to capture these relationships can be infeasible for large-scale graphs. In this work, we introduce a new higher-order sparse convolution based on the Sobolev norm of graph signals. Our Sparse Sobolev GNN (S-SobGNN) computes a cascade of filters in each layer with increasing Hadamard powers to obtain a more diverse set of functions, and a linear combination layer then weights the embeddings of each filter. We evaluate S-SobGNN on several semi-supervised learning tasks. S-SobGNN shows competitive performance in all applications compared to several state-of-the-art methods.
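To make the layer described in the abstract concrete, below is a minimal NumPy/SciPy sketch of one S-SobGNN-style layer: a cascade of sparse filters built from increasing Hadamard (element-wise) powers of a Sobolev-type matrix, followed by a learned linear combination of the filter embeddings. The helper names (sobolev_matrix, s_sob_layer), the choice of the combinatorial Laplacian plus epsilon times the identity as the Sobolev term, and the epsilon, alpha, and ReLU choices are illustrative assumptions, not the authors' implementation; see the PDF and DOI below for the actual method.

import numpy as np
import scipy.sparse as sp

def sobolev_matrix(adj, epsilon=1.0):
    # Sparse Sobolev-type term L + epsilon*I, with L the combinatorial Laplacian (assumed form).
    deg = np.asarray(adj.sum(axis=1)).ravel()
    lap = sp.diags(deg) - adj
    return (lap + epsilon * sp.eye(adj.shape[0])).tocsr()

def s_sob_layer(x, adj, weights, mix, epsilon=1.0):
    # One layer: for k = 1..alpha, propagate with the k-th Hadamard (element-wise)
    # power of the Sobolev matrix (which keeps its sparsity pattern), apply a
    # per-filter weight matrix, then linearly combine the filter outputs.
    s = sobolev_matrix(adj, epsilon)
    s_power = s.copy()
    out = np.zeros((x.shape[0], weights[0].shape[1]))
    for k, (w, a) in enumerate(zip(weights, mix)):
        if k > 0:
            s_power = s_power.multiply(s).tocsr()  # next Hadamard power, still sparse
        out += a * (s_power @ x @ w)
    return np.maximum(out, 0.0)  # ReLU nonlinearity (an assumption)

# Toy usage on a 4-node path graph
rng = np.random.default_rng(0)
adj = sp.csr_matrix(np.array([[0, 1, 0, 0],
                              [1, 0, 1, 0],
                              [0, 1, 0, 1],
                              [0, 0, 1, 0]], dtype=float))
x = rng.standard_normal((4, 3))                       # 4 nodes, 3 input features
alpha, f_out = 3, 2
weights = [0.1 * rng.standard_normal((3, f_out)) for _ in range(alpha)]
mix = np.ones(alpha) / alpha                          # uniform combination weights
print(s_sob_layer(x, adj, weights, mix).shape)        # -> (4, 2)

Because each Hadamard power preserves the sparsity pattern of the Sobolev matrix, every additional filter in the cascade costs no more storage than the first, which is the property the abstract appeals to for scalability to large graphs.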
Main file
2302.10505.pdf (690.89 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04368752, version 1 (01-01-2024)

Identifiers

Cite

Jhony H. Giraldo, Sajid Javed, Arif Mahmood, Fragkiskos D. Malliaros, Thierry Bouwmans. Higher-order Sparse Convolutions in Graph Neural Networks. ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech, and Signal Processing, Jun 2023, Rhodes Island, Greece. pp.1-5, ⟨10.1109/ICASSP49357.2023.10096494⟩. ⟨hal-04368752⟩