Subspace Identification for Multi-Source Domain Adaptation
Document Type
Conference Proceeding
Publication Title
Advances in Neural Information Processing Systems
Abstract
Multi-source domain adaptation (MSDA) methods aim to transfer knowledge from multiple labeled source domains to an unlabeled target domain. Although current methods achieve target joint distribution identifiability by enforcing minimal changes across domains, they often necessitate stringent conditions, such as an adequate number of domains, monotonic transformations of latent variables, and invariant label distributions. These requirements are challenging to satisfy in real-world applications. To mitigate the need for these strict assumptions, we propose a subspace identification theory that guarantees the disentanglement of domain-invariant and domain-specific variables under less restrictive constraints regarding domain numbers and transformation properties, thereby facilitating domain adaptation by minimizing the impact of domain shifts on invariant variables. Based on this theory, we develop a Subspace Identification Guarantee (SIG) model that leverages variational inference. Furthermore, the SIG model incorporates class-aware conditional alignment to accommodate target shifts where label distributions change with the domains. Experimental results demonstrate that our SIG model outperforms existing MSDA techniques on various benchmark datasets, highlighting its effectiveness in real-world applications.
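The abstract's central idea can be illustrated with a small sketch: an encoder splits the latent representation into domain-invariant and domain-specific subspaces, a variational (VAE-style) objective drives reconstruction, and labels are predicted from the invariant subspace only. The following PyTorch code is a minimal, hypothetical sketch of that idea, not the authors' SIG implementation; the class name `SIGLikeVAE`, all dimensions, and the loss weighting are assumptions, and the paper's class-aware conditional alignment term is omitted for brevity.

```python
# Minimal sketch (assumed PyTorch setup), NOT the authors' SIG code:
# encode x into [z_inv | z_spec], reconstruct x from both subspaces,
# and classify using only the domain-invariant part z_inv.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SIGLikeVAE(nn.Module):
    def __init__(self, x_dim=256, z_inv_dim=16, z_spec_dim=16, n_classes=10):
        super().__init__()
        z_dim = z_inv_dim + z_spec_dim
        self.z_inv_dim = z_inv_dim
        self.encoder = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU())
        self.mu_head = nn.Linear(128, z_dim)
        self.logvar_head = nn.Linear(128, z_dim)
        self.decoder = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(),
                                     nn.Linear(128, x_dim))
        # Labels are predicted from the domain-invariant subspace only.
        self.classifier = nn.Linear(z_inv_dim, n_classes)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterization trick for the variational posterior.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        z_inv = z[:, :self.z_inv_dim]
        return self.decoder(z), self.classifier(z_inv), mu, logvar

def elbo_plus_cls(model, x, y=None, beta=1.0):
    """VAE loss (reconstruction + KL) plus cross-entropy on labeled source data."""
    x_rec, logits, mu, logvar = model(x)
    rec = F.mse_loss(x_rec, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = rec + beta * kl
    if y is not None:  # source domains are labeled; the target domain is not
        loss = loss + F.cross_entropy(logits, y)
    return loss

if __name__ == "__main__":
    model = SIGLikeVAE()
    x_src, y_src = torch.randn(32, 256), torch.randint(0, 10, (32,))
    print(elbo_plus_cls(model, x_src, y_src).item())
```

In such a setup, unlabeled target batches would contribute only the reconstruction and KL terms, while an additional alignment loss (class-aware in the paper) would reduce domain shift on the invariant subspace.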
Publication Date
1-1-2023
Recommended Citation
Z. Li et al., "Subspace Identification for Multi-Source Domain Adaptation," Advances in Neural Information Processing Systems, vol. 36, Jan 2023.