Preprint / Working paper. Year: 2024

Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs

Abstract

We study the convergence of message passing graph neural networks on random graph models to their continuous counterpart as the number of nodes tends to infinity. Until now, this convergence was only known for architectures with aggregation functions in the form of normalized means or, equivalently, of an application of classical operators like the adjacency matrix or the graph Laplacian. We extend such results to a large class of aggregation functions that encompasses all classically used message passing graph neural networks, such as attention-based message passing, max convolutional message passing, (degree-normalized) convolutional message passing, and moment-based aggregation message passing. Under mild assumptions, we give non-asymptotic bounds with high probability to quantify this convergence. Our main result is based on the McDiarmid inequality. Interestingly, this result does not apply to the case where the aggregation is a coordinate-wise maximum; we treat this case separately and obtain a different convergence rate.
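As a rough illustration of what "generic aggregation" means here (a minimal sketch, not the authors' construction; all names such as mpnn_layer, mean_agg, max_agg, and moment_agg are illustrative assumptions), one message passing step can be written with the aggregation function as a pluggable argument:

import numpy as np

def mpnn_layer(A, H, aggregate):
    # One generic message passing step: node i receives its neighbours'
    # feature vectors as messages and combines them with `aggregate`.
    n = A.shape[0]
    out = np.zeros_like(H)
    for i in range(n):
        neighbours = np.flatnonzero(A[i])
        if neighbours.size > 0:
            out[i] = aggregate(H[neighbours])
    return out

# Illustrative aggregations of the kind the abstract lists:
def mean_agg(msgs):            # normalized mean (classical convolutional MP)
    return msgs.mean(axis=0)

def max_agg(msgs):             # coordinate-wise maximum (treated separately)
    return msgs.max(axis=0)

def moment_agg(msgs, p=2):     # moment-based aggregation
    return (np.abs(msgs) ** p).mean(axis=0) ** (1.0 / p)

# Toy usage on a small symmetric random graph.
rng = np.random.default_rng(0)
A = np.triu((rng.random((5, 5)) < 0.5).astype(float), 1)
A = A + A.T
H = rng.standard_normal((5, 3))
print(mpnn_layer(A, H, mean_agg))
print(mpnn_layer(A, H, max_agg))

Convergence as the number of nodes grows is then a statement about the output of such layers when A is sampled from a random graph model.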
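For reference, the McDiarmid (bounded differences) inequality invoked above is the standard concentration result: if f satisfies |f(x) - f(x')| <= c_i whenever x and x' differ only in their i-th coordinate, then for independent X_1, ..., X_n and all t > 0,

\Pr\Big( \big| f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \big| \ge t \Big)
\;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right).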
Main file: main.pdf (823.35 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04059402 , version 1 (21-04-2023)
hal-04059402 , version 2 (14-07-2023)
hal-04059402 , version 3 (13-08-2024)

Identifiers

HAL Id: hal-04059402

Cite

Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter. Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs. 2024. ⟨hal-04059402v3⟩
203 views
174 downloads
