Preprint, Working Paper. Year: 2023

Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs

Abstract

We study the convergence of message passing graph neural networks on random graph models to their continuous counterparts as the number of nodes tends to infinity. Until now, this convergence was only known for architectures with aggregation functions in the form of degree-normalized means. We extend such results to a very large class of aggregation functions that encompasses all classically used message passing graph neural networks, such as attention-based message passing or max convolutional message passing, on top of (degree-normalized) convolutional message passing. Under mild assumptions, we give non-asymptotic, high-probability bounds that quantify this convergence. Our main result is based on the McDiarmid inequality. Interestingly, we treat the case where the aggregation is a coordinate-wise maximum separately, as it necessitates a very different proof technique and yields a qualitatively different convergence rate.
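As a hypothetical illustration (not the paper's code), the sketch below shows one message passing step on a latent-space random graph, comparing two of the aggregations mentioned in the abstract: a degree-normalized mean and a coordinate-wise max. The kernel, the graph model, and the function name mp_layer are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming a graphon-type random graph: nodes have latent
# positions in [0, 1] and edges appear independently with a probability
# given by a smooth kernel of those positions.
import numpy as np

rng = np.random.default_rng(0)

n, d = 500, 8                        # number of nodes, feature dimension
x = rng.uniform(size=(n, 1))         # latent positions in [0, 1]

def kernel(u, v):
    # Illustrative smooth connection kernel (an assumption, not the paper's).
    return 0.8 * np.exp(-np.abs(u - v))

p = kernel(x, x.T)                   # n x n edge probabilities
adj = rng.uniform(size=(n, n)) < p
adj = np.triu(adj, 1)
adj = adj | adj.T                    # symmetric adjacency, no self-loops

feats = rng.normal(size=(n, d))      # initial node features

def mp_layer(adj, feats, agg):
    """One message passing step: aggregate each node's neighbor features."""
    out = np.zeros_like(feats)
    for i in range(adj.shape[0]):
        nbrs = np.flatnonzero(adj[i])
        if nbrs.size == 0:
            continue                 # isolated node keeps zero features
        msgs = feats[nbrs]
        if agg == "mean":            # degree-normalized mean aggregation
            out[i] = msgs.mean(axis=0)
        elif agg == "max":           # coordinate-wise max aggregation
            out[i] = msgs.max(axis=0)
    return out

h_mean = mp_layer(adj, feats, "mean")
h_max = mp_layer(adj, feats, "max")
print(h_mean.shape, h_max.shape)     # (500, 8) (500, 8)
```

As n grows, the mean-aggregated output at a node with latent position u concentrates around a kernel-weighted integral of the features, which is the continuous-limit behavior the paper quantifies; the max case concentrates too, but, as the abstract notes, with a qualitatively different rate.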

Dates and versions

hal-04059402, version 1 (21-04-2023)
hal-04059402, version 2 (14-07-2023)
hal-04059402, version 3 (13-08-2024)

Identifiers

  • HAL Id: hal-04059402, version 1

Cite

Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter. Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs. 2023. ⟨hal-04059402v1⟩