Document Type
Conference Proceeding
Publication Title
Proceedings of Machine Learning Research
Abstract
Generating unlabeled data has recently been shown to help address the few-shot hypothesis adaptation (FHA) problem, where we aim to train a classifier for the target domain with only a few labeled target-domain data and a well-trained source-domain classifier (i.e., a source hypothesis), by exploiting the additional information carried by highly compatible unlabeled data. However, the data generated by existing methods are extremely similar or even identical, and this strong dependency among the generated data causes learning to fail. In this paper, we propose a diversity-enhancing generative network (DEG-Net) for the FHA problem, which generates diverse unlabeled data with the help of a kernel independence measure: the Hilbert-Schmidt independence criterion (HSIC). Specifically, DEG-Net generates data by minimizing the HSIC value (i.e., maximizing the independence) among the semantic features of the generated data. As a result, the unlabeled data generated by DEG-Net are more diverse and more effective for addressing the FHA problem. Experimental results show that DEG-Net outperforms existing FHA baselines and further verify that generating diverse data plays a vital role in addressing the FHA problem.
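To make the HSIC-based diversity objective concrete, the sketch below shows a biased empirical HSIC estimator with Gaussian kernels, used as a penalty that discourages dependence between the semantic features of generated samples. This is a minimal illustration under assumed conventions, not the authors' implementation; the function and variable names (gaussian_kernel, hsic, feats_a, feats_b) are hypothetical.

import torch

def gaussian_kernel(x, sigma=1.0):
    # x: (n, d) feature matrix -> (n, n) Gaussian (RBF) kernel matrix
    sq_dists = torch.cdist(x, x) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H is the centering matrix I - (1/n) 11^T.
    n = x.shape[0]
    K = gaussian_kernel(x, sigma)
    L = gaussian_kernel(y, sigma)
    H = torch.eye(n, device=x.device) - torch.ones(n, n, device=x.device) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative usage: penalize dependence between the semantic features of
# two batches of generated samples; adding this term to the generator loss
# pushes the generated data toward mutual independence (i.e., diversity).
feats_a = torch.randn(32, 128)
feats_b = torch.randn(32, 128)
diversity_penalty = hsic(feats_a, feats_b)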
First Page
8260
Last Page
8275
Publication Date
7-2023
Keywords
Hilbert-Schmidt independence criterion, Independence measure, Semantic features, Target domain, Unlabeled data
Recommended Citation
R. Dong et al., "Diversity-enhancing Generative Network for Few-shot Hypothesis Adaptation," Proceedings of Machine Learning Research, vol. 202, pp. 8260-8275, Jul 2023.
Comments
Preprint version from arXiv
Uploaded on June 21, 2024