Leveraging Relational Graph Neural Network for Transductive Model Ensemble

Document Type

Conference Proceeding

Publication Title

Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

Abstract

Traditional pre-training, fine-tuning, and ensembling pipelines often overlook the relational structure that connects data and tasks. To address this gap, we present a novel approach that harnesses this relational information through a relational graph-based model, the Relational grAph Model ensemBLE (RAMBLE). The model performs class label inference simultaneously over all data nodes and task nodes, using the relational graph in a transductive manner; this fine-grained approach lets us better capture the intricate interplay between data and tasks. Furthermore, we incorporate a novel variational information bottleneck-guided scheme for embedding fusion and aggregation, which produces an informative fused embedding by focusing on embeddings beneficial to the target task while filtering out noise-laden ones. Our theoretical analysis, grounded in information theory, confirms that using relational information for embedding fusion yields higher upper and lower bounds on the target task's accuracy. We evaluate the proposed model on eight diverse datasets, and the experimental results demonstrate that it effectively utilizes relational knowledge from all pre-trained models, improving performance on the target tasks.
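
The record does not spell out the method algorithmically, but the two ingredients named in the abstract (relation-aware message passing over data and task nodes, and a variational information bottleneck that fuses embeddings from several pre-trained models) can be sketched roughly as below. This is an illustrative sketch only, not the authors' implementation; the class names RelationalLayer and VIBFusion, the mean-pooling fusion, the standard-normal prior, and all shapes are assumptions.

# Illustrative sketch (PyTorch), not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalLayer(nn.Module):
    """One message-passing step with a separate weight matrix per relation type."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_weights = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_relations)])
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, x, edge_index, edge_type):
        # x: [N, dim]; edge_index: [2, E] (source, target); edge_type: [E]
        out = self.self_loop(x)
        for r, lin in enumerate(self.rel_weights):
            mask = edge_type == r
            src, dst = edge_index[0, mask], edge_index[1, mask]
            msg = lin(x[src])
            agg = torch.zeros_like(x).index_add_(0, dst, msg)
            deg = torch.zeros(x.size(0), device=x.device).index_add_(
                0, dst, torch.ones_like(dst, dtype=x.dtype)
            ).clamp(min=1).unsqueeze(-1)
            out = out + agg / deg  # mean aggregation per relation type
        return F.relu(out)


class VIBFusion(nn.Module):
    """Fuse K pre-trained embeddings into a stochastic bottleneck variable z."""

    def __init__(self, in_dims, z_dim: int):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(d, z_dim) for d in in_dims])
        self.to_mu = nn.Linear(z_dim, z_dim)
        self.to_logvar = nn.Linear(z_dim, z_dim)

    def forward(self, embeddings):
        # embeddings: list of K tensors, each of shape [N, in_dims[k]]
        h = torch.stack([p(e) for p, e in zip(self.proj, embeddings)], dim=0).mean(0)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        # KL(q(z|x) || N(0, I)): the information-bottleneck regularizer
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return z, kl


if __name__ == "__main__":
    # Toy relational graph: 4 data nodes + 2 task nodes, 2 relation types.
    x = torch.randn(6, 16)
    edge_index = torch.tensor([[0, 1, 2, 3], [4, 4, 5, 5]])  # data -> task edges
    edge_type = torch.tensor([0, 0, 1, 1])
    fused, kl = VIBFusion(in_dims=[16, 16], z_dim=16)([x, x.clone()])
    out = RelationalLayer(dim=16, num_relations=2)(fused, edge_index, edge_type)
    print(out.shape, float(kl))

A full training loop would add a classification loss on labeled nodes plus a weighted KL term, so the bottleneck keeps only the pre-trained information useful for the target task.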

First Page

775

Last Page

787

DOI

10.1145/3580305.3599414

Publication Date

August 4, 2023

Keywords

graph neural networks, information bottlenecks, transfer learning
