
Invariant graph learning meets information bottleneck for out-of-distribution generalization

February 5, 2026 | Higher Education Press

Graph out-of-distribution (OOD) generalization remains a major challenge for graph neural networks (GNNs). Invariant learning, which aims to extract features that remain stable across varied distributions, has recently emerged as a promising approach to OOD generalization. However, its exploration on graph data remains constrained by the complex nature of graphs. Invariant features exist at both the attribute and structural levels, and combined with the absence of prior knowledge about environmental factors, this makes the invariance and sufficiency conditions of invariant learning hard to satisfy on graph data. Existing approaches, such as data augmentation or causal intervention, either disrupt invariance during graph manipulation or face reliability issues due to the lack of supervised signals for the causal parts.

To address these problems, a research team led by Wenyu Mao published new research on 15 January 2026 in Frontiers of Computer Science, co-published by Higher Education Press and Springer Nature.

The team proposes a novel framework, called Invariant Graph Learning based on Information bottleneck theory (InfoIGL), to extract the invariant features of graphs and enhance models' ability to generalize to unseen distributions. Specifically, InfoIGL introduces a redundancy filter to compress task-irrelevant information related to environmental factors. Cooperating with the designed multi-level contrastive learning, InfoIGL maximizes the mutual information among graphs of the same class in downstream classification tasks, largely preserving the invariant features used for prediction. An appealing property of InfoIGL is its strong generalization ability without depending on supervised signals of invariance.
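The press release does not give InfoIGL's exact loss, but the idea of maximizing mutual information among same-class graph embeddings is commonly approximated with an InfoNCE-style supervised contrastive loss. The sketch below is an illustrative NumPy implementation of that general surrogate, not the authors' code; the function name, temperature value, and embedding shapes are assumptions for the example.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.5):
    """InfoNCE-style supervised contrastive loss (illustrative sketch).

    Pulls together embeddings of graphs that share a class label and
    pushes apart the rest, serving as a tractable surrogate for
    maximizing mutual information among same-class graphs.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature

    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    # Exclude each sample from its own softmax denominator
    logits = np.where(self_mask, -np.inf, sim)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Positives: other samples with the same class label
    positives = (labels[:, None] == labels[None, :]) & ~self_mask
    per_anchor = (np.where(positives, log_prob, 0.0).sum(axis=1)
                  / np.maximum(positives.sum(axis=1), 1))
    return -per_anchor.mean()
```

For instance, embeddings that are already aligned within each class should score a lower loss than embeddings scattered across classes, which is the training signal the contrastive objective provides.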

Experiments on both synthetic and real-world datasets demonstrate that this method achieves state-of-the-art performance under OOD generalization for graph classification tasks.

Article Information

Journal: Frontiers of Computer Science
DOI: 10.1007/s11704-025-40798-3
Method of Research: Experimental study
Subject of Research: Not applicable
Article Title: Invariant graph learning meets information bottleneck for out-of-distribution generalization
Article Publication Date: 15-Jan-2026

Contact Information

Rong Xie
Higher Education Press
xierong@hep.com.cn

How to Cite This Article

APA:
Higher Education Press. (2026, February 5). Invariant graph learning meets information bottleneck for out-of-distribution generalization. Brightsurf News. https://www.brightsurf.com/news/147PMZO1/invariant-graph-learning-meets-information-bottleneck-for-out-of-distribution-generalization.html
MLA:
"Invariant graph learning meets information bottleneck for out-of-distribution generalization." Brightsurf News, 5 Feb. 2026, https://www.brightsurf.com/news/147PMZO1/invariant-graph-learning-meets-information-bottleneck-for-out-of-distribution-generalization.html.