Homophily modulates double descent generalization in graph convolution networks
Publication type:
Article
Authors:
Shi, Cheng; Pan, Liming; Hu, Hong; Dokmanic, Ivan
Affiliations:
University of Basel; Chinese Academy of Sciences; University of Science & Technology of China, CAS; Nanjing Normal University; University of Pennsylvania; University of Illinois System; University of Illinois Urbana-Champaign
Journal:
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
ISSN:
0027-8424
DOI:
10.1073/pnas.2309504121
Publication date:
2024-02-20
Keywords:
statistical mechanics
universality
Abstract:
Graph neural networks (GNNs) excel in modeling relational data such as biological networks. We use analytical tools from statistical physics and random matrix theory to precisely characterize generalization in simple graph convolution networks on the contextual stochastic block model. Our results illuminate the nuances of learning on homophilic versus heterophilic data and predict double descent, whose existence in GNNs has been questioned by recent work. We show how risk is shaped by the interplay between the graph noise, the feature noise, and the number of training labels. Our findings apply beyond stylized models, capturing qualitative trends in real-world GNNs and datasets. As a case in point, we use our analytic insights to improve the performance of state-of-the-art graph convolution networks on heterophilic datasets.
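The contextual stochastic block model and the simple graph convolution mentioned in the abstract can be illustrated with a small numerical sketch. The snippet below is a hypothetical example, not the authors' code: the functions csbm and simple_gcn_risk and the parameters p_in, p_out, mu, and ridge are illustrative assumptions. It samples a two-block CSBM, propagates node features with a degree-normalized adjacency, fits ridge regression on a subset of labels, and compares transductive test risk on a homophilic versus a heterophilic graph.

```python
import numpy as np

rng = np.random.default_rng(0)

def csbm(n=1000, d=50, p_in=0.05, p_out=0.01, mu=1.5):
    """Sample a contextual stochastic block model (CSBM) graph.

    Two balanced communities; p_in > p_out gives a homophilic graph,
    p_in < p_out a heterophilic one. Features are a rank-one spike
    aligned with the labels plus Gaussian noise (an illustrative choice).
    """
    y = rng.choice([-1.0, 1.0], size=n)
    same = np.equal.outer(y, y)                 # True where nodes share a community
    probs = np.where(same, p_in, p_out)
    A = (rng.random((n, n)) < probs).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                 # symmetric adjacency, no self-loops
    u = rng.standard_normal(d) / np.sqrt(d)
    X = np.sqrt(mu / n) * np.outer(y, u) + rng.standard_normal((n, d)) / np.sqrt(d)
    return A, X, y

def simple_gcn_risk(A, X, y, n_train=200, ridge=1e-2):
    """One-layer simple graph convolution: propagate features with the
    degree-normalized adjacency (self-loops added), then fit ridge
    regression on the training labels and report squared error on the
    remaining (unlabeled) nodes."""
    n = len(y)
    A_hat = A + np.eye(n)
    deg = A_hat.sum(1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    H = D_inv_sqrt @ A_hat @ D_inv_sqrt @ X     # propagated features
    idx = rng.permutation(n)
    tr, te = idx[:n_train], idx[n_train:]
    w = np.linalg.solve(H[tr].T @ H[tr] + ridge * np.eye(H.shape[1]),
                        H[tr].T @ y[tr])
    return np.mean((H[te] @ w - y[te]) ** 2)

for p_in, p_out, tag in [(0.05, 0.01, "homophilic"), (0.01, 0.05, "heterophilic")]:
    A, X, y = csbm(p_in=p_in, p_out=p_out)
    print(f"{tag:>13s} graph: test risk = {simple_gcn_risk(A, X, y):.3f}")
```

Sweeping n_train or the feature dimension in such a sketch is one way to probe the interplay of graph noise, feature noise, and label count described in the abstract; the paper's exact model, normalization, and estimator may differ from this sketch.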