Aggregation Buffer: Revisiting DropEdge with a New Parameter Block

Lee D; Kong M; Hamid S; Lee C; Yoo J

Research article in edited proceedings (conference) | Peer reviewed

Abstract

We revisit DropEdge, a data augmentation technique for GNNs that randomly removes edges to expose diverse graph structures during training. Although DropEdge is a promising way to reduce overfitting to specific connections in the graph, we observe that its potential performance gain in supervised learning tasks is significantly limited. To understand why, we provide a theoretical analysis showing that the limited performance of DropEdge stems from a fundamental limitation shared by many GNN architectures. Based on this analysis, we propose the Aggregation Buffer, a parameter block specifically designed to improve the robustness of GNNs by addressing this limitation. Our method is compatible with any GNN model and shows consistent performance improvements on multiple datasets. Moreover, it addresses well-known problems such as degree bias and structural disparity as a unifying solution. Code and datasets are available at https://github.com/dooho00/agg-buffer.
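
For readers unfamiliar with the augmentation being revisited, the sketch below shows the core of DropEdge: at each training step, every edge is kept independently with some probability, so the model sees a differently sparsified graph each time. This is an illustrative reconstruction, not the authors' released code (linked above); the function name drop_edge and the [2, E] edge_index layout are assumptions borrowed from the PyTorch Geometric convention.

    import torch

    def drop_edge(edge_index: torch.Tensor, p: float = 0.2) -> torch.Tensor:
        # edge_index: [2, E] COO edge list (PyTorch Geometric convention).
        # Each edge survives independently with probability 1 - p, so
        # resampling the mask every epoch exposes the GNN to diverse
        # graph structures, which is the regularization effect described
        # in the abstract.
        keep = torch.rand(edge_index.size(1)) >= p
        return edge_index[:, keep]

    # Toy usage: a directed 4-node cycle; each call yields a random subgraph.
    edge_index = torch.tensor([[0, 1, 2, 3],
                               [1, 2, 3, 0]])
    print(drop_edge(edge_index, p=0.5))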

Details about the publication

Editors: Singh A; Fazel M; Hsu D; Lacoste-Julien S; Berkenkamp F; Maharaj T; Wagstaff K; Zhu J
Book title: Proceedings of the 42nd International Conference on Machine Learning (Volume 267)
Page range: 33181-33204
Publisher: ML Research Press
Title of series: Proceedings of Machine Learning Research
Status: Published
Release year: 2025
Conference: International Conference on Machine Learning (ICML) 2025, Vancouver, Canada
Link to the full text: https://proceedings.mlr.press/v267/lee25n.html
Keywords: Graph neural networks, DropEdge, data augmentation, edge robustness, node classification, degree bias, structural disparity

Authors from the University of Münster

Hamid, Sagad
Junior Professorship of Practical Computer Science - Modern Aspects of Data Processing / Data Science (Prof. Braun)