Dynamic Spiking Graph Neural Networks
Document Type
Conference Proceeding
Publication Title
Proceedings of the AAAI Conference on Artificial Intelligence
Abstract
The integration of Spiking Neural Networks (SNNs) and Graph Neural Networks (GNNs) is gradually attracting attention due to their low power consumption and high efficiency in processing the non-Euclidean data represented by graphs. However, dynamic graph representation learning faces challenges such as high complexity and large memory overheads. Current work often replaces Recurrent Neural Networks (RNNs) with SNNs, using binary features instead of continuous ones for efficient training; this overlooks graph structure information and loses detail during propagation. Additionally, optimizing dynamic spiking models typically requires propagating information across time steps, which increases memory requirements. To address these challenges, we present a framework named Dynamic Spiking Graph Neural Networks (Dy-SIGN). To mitigate information loss, Dy-SIGN propagates early-layer information directly to the last layer for information compensation. To reduce memory requirements, we apply implicit differentiation at the equilibrium state, which does not rely on the exact reverse of the forward computation. While traditional implicit differentiation methods are usually applied in static settings, Dy-SIGN extends them to dynamic graphs. Extensive experiments on three large-scale real-world dynamic graph datasets validate the effectiveness of Dy-SIGN on dynamic node classification tasks with lower computational costs.
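To make the abstract's memory argument concrete, the Python sketch below illustrates generic implicit differentiation at an equilibrium state (the deep-equilibrium-style technique the abstract alludes to): the forward pass runs a fixed-point solver with gradient tracking disabled, and the backward pass solves for the gradient via the implicit function theorem instead of backpropagating through every solver step. This is a minimal illustration under stated assumptions, not the paper's Dy-SIGN architecture; the update f(z, x) = tanh(Wz + x), the iteration counts, and all variable names (W, z0, loss_grad) are hypothetical choices made for clarity.

import torch

torch.manual_seed(0)
d = 8
W = torch.randn(d, d) * 0.1            # small weights so f is (likely) contractive
x = torch.randn(d, requires_grad=True)

def f(z, x):
    # Illustrative equilibrium update, not the paper's actual layer.
    return torch.tanh(z @ W.T + x)

# Forward: find the fixed point z* = f(z*, x) by simple iteration.
# No gradients are tracked, so memory does not grow with solver depth.
with torch.no_grad():
    z = torch.zeros(d)
    for _ in range(50):
        z = f(z, x)

# Backward: attach z* to the graph for a single function evaluation,
# then solve g = dL/dz* + J^T g (J = df/dz at z*) by fixed-point
# iteration, rather than reversing the 50 forward steps.
z0 = z.detach().requires_grad_()
f0 = f(z0, x)

loss_grad = torch.ones(d)              # stand-in for dL/dz*
g = loss_grad.clone()
for _ in range(50):
    g = loss_grad + torch.autograd.grad(f0, z0, g, retain_graph=True)[0]

# Push the implicit gradient through f's dependence on x: dL/dx = g^T df/dx.
dx = torch.autograd.grad(f0, x, g)[0]
print(dx)

The design point mirrored here is that only one evaluation of f is ever kept in the autograd graph, so memory is constant in the number of solver iterations; the dynamic-graph extension described in the abstract builds on this property across time steps.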
First Page
16495
Last Page
16503
DOI
10.1609/aaai.v38i15.29587
Publication Date
3-25-2024
Recommended Citation
N. Yin et al., "Dynamic Spiking Graph Neural Networks," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 15, pp. 16495–16503, Mar. 2024.
The definitive version is available at https://doi.org/10.1609/aaai.v38i15.29587