In graph classification, the out-of-distribution (OOD) problem is attracting growing attention. A prevailing idea for addressing it is to learn stable features, on the assumption that they are the substructures that causally determine the label and that their relationship with the label remains stable under distribution shift. In contrast, the complementary parts, termed environmental features, cannot determine the label on their own and hold varying relationships with the label; they are therefore regarded as a likely cause of distribution shift. Existing generalization efforts mainly encourage the model to be insensitive to environmental features, whereas sensitivity to stable features, although promising for distinguishing the crucial clues from distributional uncertainty, remains largely unexplored. To the best of our knowledge, a paradigm that simultaneously pursues sensitivity to stable features and insensitivity to environmental features for generalizable graph classification has so far been lacking. In this work, we conjecture that generalizable models should be sensitive to stable features and insensitive to environmental features. To this end, we propose a simple yet effective augmentation strategy for graph classification:
Equivariant and Invariant Cross-Data Augmentation (EI-CDA). Given a pair of input graphs, we first estimate their stable and environmental features via masks. For equivariance, we linearly mix the estimated stable features of the two graphs and encourage the model's predictions to faithfully reflect the mixed semantics. For invariance, we swap the estimated environmental features of the two graphs and require the predictions to remain unchanged. This simple yet effective strategy endows models with both sensitivity to stable features and insensitivity to environmental features. Extensive experiments show that EI-CDA significantly improves performance and outperforms leading baselines. Our code is available at: https://github.com/yongduosui/EI-GNN.
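To make the two objectives concrete, below is a minimal PyTorch sketch of the mix-and-swap losses, assuming the stable and environmental features of each graph have already been estimated by upstream masks (not shown). The names `ToyClassifier` and `ei_cda_losses`, the additive recombination of the two parts, and the mixup-style soft target are illustrative assumptions, not the paper's actual implementation; see the repository above for the authors' code.

```python
# A minimal sketch of the two EI-CDA objectives on graph-level features.
# Requires PyTorch >= 1.10 (soft targets in F.cross_entropy).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyClassifier(nn.Module):
    """Hypothetical stand-in for the GNN readout and classifier head."""
    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        self.head = nn.Linear(dim, num_classes)

    def forward(self, stable: torch.Tensor, env: torch.Tensor) -> torch.Tensor:
        # Assumed recombination: score the sum of stable and environmental parts.
        return self.head(stable + env)

def ei_cda_losses(clf, s1, e1, y1, s2, e2, y2, num_classes, lam=0.5):
    """Equivariant (mix) and invariant (swap) losses for one graph pair.

    s*/e*: stable / environmental graph-level features, assumed to come
    from learned masks upstream (not shown); y*: integer class labels.
    """
    # Equivariance: mix the stable parts and require the prediction to
    # reflect the mixed semantics via a mixup-style soft target.
    s_mix = lam * s1 + (1.0 - lam) * s2
    y_mix = lam * F.one_hot(y1, num_classes).float() \
          + (1.0 - lam) * F.one_hot(y2, num_classes).float()
    loss_eq = F.cross_entropy(clf(s_mix, e1), y_mix)

    # Invariance: swap the environmental parts across the pair and keep
    # each prediction tied to its original label.
    loss_inv = F.cross_entropy(clf(s1, e2), y1) \
             + F.cross_entropy(clf(s2, e1), y2)
    return loss_eq, loss_inv

# Toy usage on random features for a batch of 8 graph pairs.
clf = ToyClassifier(dim=16, num_classes=3)
s1, e1, s2, e2 = (torch.randn(8, 16) for _ in range(4))
y1, y2 = torch.randint(0, 3, (8,)), torch.randint(0, 3, (8,))
loss_eq, loss_inv = ei_cda_losses(clf, s1, e1, y1, s2, e2, y2, num_classes=3)
(loss_eq + loss_inv).backward()
```

The two losses directly mirror the abstract: the equivariant term makes the prediction track the mixed stable semantics, while the invariant term penalizes any change in prediction when only the environmental part is replaced.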