Paper ID | SPTM-11.3
Paper Title | GRAPHON AND GRAPH NEURAL NETWORK STABILITY
Authors | Luana Ruiz, Zhiyang Wang, Alejandro Ribeiro, University of Pennsylvania, United States
Session | SPTM-11: Graph Neural Networks
Location | Gather.Town |
Session Time | Wednesday, 09 June, 16:30 - 17:15
Presentation Time | Wednesday, 09 June, 16:30 - 17:15
Presentation | Poster
Topic | Signal Processing Theory and Methods: [SIPG] Signal and Information Processing over Graphs
Abstract | Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of large-scale network data. GNN stability is thus important, as in real-world scenarios there are typically uncertainties associated with the graph. We analyze GNN stability using kernel objects called graphons. Graphons are both limits of convergent graph sequences and generating models for deterministic and stochastic graphs. Building upon the theory of graphon signal processing, we define graphon neural networks and analyze their stability to graphon perturbations. We then extend this analysis by interpreting the graphon neural network as a generating model for GNNs on deterministic and stochastic graphs instantiated from the original and perturbed graphons. We observe that GNNs are Lipschitz stable to graphon perturbations with a stability constant that decreases asymptotically with the size of the graph. This asymptotic behavior is further demonstrated in a movie recommendation experiment.
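The sketch below illustrates the setup the abstract describes: a graphon used as a generating model for stochastic graphs, with the same filter coefficients applied to graphs sampled from an original and a perturbed graphon. It is a minimal illustration, not the paper's method or experiment; the exponential kernel `graphon`, the additive 0.05 perturbation, the filter taps `h`, and the helper names (`sample_stochastic_graph`, `gnn_layer`) are all assumptions chosen for demonstration.

```python
import numpy as np

# Assumed graphon W(u, v): any symmetric kernel [0,1]^2 -> [0,1] would do;
# this smooth exponential kernel is purely illustrative.
def graphon(u, v):
    return np.exp(-2.0 * np.abs(u - v))

def sample_stochastic_graph(W, u, rng):
    """Instantiate a stochastic graph from graphon W at latent points u:
    connect nodes i and j with probability W(u_i, u_j)."""
    n = len(u)
    P = W(u[:, None], u[None, :])              # edge probability matrix
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)                          # keep upper triangle, no self-loops
    return A + A.T                             # symmetric adjacency

def gnn_layer(A, x, h):
    """One illustrative GNN layer: a polynomial graph filter followed by a ReLU.
    The shift operator is normalized by n, as in graphon signal processing."""
    S = A / A.shape[0]
    y = np.zeros_like(x)
    Sk_x = x.copy()
    for hk in h:                               # sum_k h_k * S^k x
        y += hk * Sk_x
        Sk_x = S @ Sk_x
    return np.maximum(y, 0.0)

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, 0.25])                 # shared filter taps (same coefficients on both graphs)

# Compare outputs on graphs instantiated from the original and a perturbed graphon,
# using the same latent points and the same induced graph signal.
for n in [50, 200, 800]:
    u = np.sort(rng.uniform(size=n))           # latent node positions in [0, 1]
    x = np.cos(2 * np.pi * u)                  # graph signal induced from a graphon signal
    A1 = sample_stochastic_graph(graphon, u, rng)
    A2 = sample_stochastic_graph(lambda a, b: np.clip(graphon(a, b) + 0.05, 0.0, 1.0), u, rng)
    diff = np.linalg.norm(gnn_layer(A1, x, h) - gnn_layer(A2, x, h)) / np.sqrt(n)
    print(f"n = {n:4d}, per-node output difference = {diff:.3f}")
```

Running the loop over increasing n gives a rough sense of the stability behavior stated in the abstract, namely that outputs of GNNs built from the original and perturbed graphons stay close; the printed quantity is only an ad hoc per-node distance, not the stability constant analyzed in the paper.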