Towards Instance-adaptive Inference for Federated Learning
Document Type
Conference Proceeding
Publication Title
Proceedings of the IEEE International Conference on Computer Vision
Abstract
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating locally trained models. However, the performance of the global model is often hampered by non-i.i.d. data distributions across clients, requiring extensive efforts to mitigate inter-client data heterogeneity. Going beyond inter-client data heterogeneity, we note that intra-client heterogeneity can also be observed on complex real-world data and can seriously deteriorate FL performance. In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework. Instead of maintaining huge instance-adaptive models, we resort to a parameter-efficient fine-tuning method, i.e., scale and shift deep features (SSF), upon a pre-trained model. Specifically, we first train an SSF pool for each client and aggregate these SSF pools on the server side, thus still maintaining a low communication cost. To enable instance-adaptive inference, for a given instance, we dynamically find the best-matched SSF subsets from the pool and aggregate them to generate an adaptive SSF tailored to the instance, thereby reducing the intra-client as well as the inter-client heterogeneity. Extensive experiments show that our FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement over the top-performing method with less than 15% of its communication cost on Tiny-ImageNet.
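The abstract describes two mechanisms: SSF, which modulates frozen pre-trained features with a per-channel scale and shift, and an instance-adaptive step that matches an instance against an SSF pool and aggregates the best-matched entries. The sketch below is an illustrative reconstruction of those two ideas, not the paper's implementation; the key-based matching, softmax weighting, and all function names are assumptions for exposition.

```python
import numpy as np

def ssf_transform(features, gamma, beta):
    """SSF idea: channel-wise scale-and-shift of frozen deep features.

    Only (gamma, beta) are trained and communicated, which is what keeps
    the method parameter- and communication-efficient.
    """
    return features * gamma + beta

def instance_adaptive_ssf(query_key, pool_keys, pool_gammas, pool_betas, top_k=2):
    """Pick the best-matched SSF entries for one instance and blend them.

    query_key is assumed to be an embedding of the instance from the frozen
    backbone; pool_keys are learned keys, one per SSF entry in the pool.
    """
    # Cosine similarity between the instance and each SSF key in the pool.
    sims = pool_keys @ query_key / (
        np.linalg.norm(pool_keys, axis=1) * np.linalg.norm(query_key) + 1e-8
    )
    idx = np.argsort(sims)[-top_k:]            # indices of the top-k matches
    weights = np.exp(sims[idx])                # softmax aggregation weights
    weights /= weights.sum()
    # Weighted average of the matched entries gives an instance-specific SSF.
    gamma = (weights[:, None] * pool_gammas[idx]).sum(axis=0)
    beta = (weights[:, None] * pool_betas[idx]).sum(axis=0)
    return gamma, beta
```

In this reading, each client trains its own pool of (key, gamma, beta) triples, the server aggregates the pools, and at test time every instance receives its own blended (gamma, beta), which is how both intra- and inter-client heterogeneity are addressed with a single shared backbone.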
First Page
23230
Last Page
23239
DOI
10.1109/ICCV51070.2023.02128
Publication Date
1-1-2023
Recommended Citation
C. Feng et al., "Towards Instance-adaptive Inference for Federated Learning," Proceedings of the IEEE International Conference on Computer Vision, pp. 23230-23239, Jan 2023.
The definitive version is available at https://doi.org/10.1109/ICCV51070.2023.02128