Latent Domain Generation for Unsupervised Domain Adaptation Object Counting

Document Type


Publication Title

IEEE Transactions on Multimedia


Unsupervised cross-domain object counting, which generalizes a model trained on a labeled source domain to an unlabeled target domain, has recently received great attention in computer vision. It is an extremely challenging task because only unlabeled data is available from the target domain and the domain gap between the two domains is implicit in object counting. In this paper, we propose a latent domain generation method that improves the generalization ability of unsupervised domain adaptation for object counting by generating a latent domain. To this end, we design a domain generator with random perturbations that learns a new latent distribution derived from the original source distribution. The generator samples target information in its stochastic latent representation, which preserves the original target information and enhances diversity. Meanwhile, to ensure that the generated latent domain is consistent with the source domain in counting performance, we introduce a consistency loss that encourages similar outputs from the latent and source domains. Moreover, to enhance the adaptation ability of the generated latent domain, we apply an adversarial loss to align the latent and target domains. Together, the adversarial and consistency losses ensure that the generated domain is aligned with the target domain while also improving the robustness of the original source-domain model. Experiments indicate that our framework extends effortlessly to scenarios with different objects (crowds, cars) and demonstrate the effectiveness of our method on unsupervised realistic-to-realistic crowd counting.
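The abstract describes a training objective combining three terms: a supervised counting loss on the source domain, a consistency loss encouraging the latent domain's predictions to match the source domain's, and an adversarial loss aligning the latent and target domains. The following is a minimal sketch of how such an objective could be composed; the loss choices (MSE for counting and consistency, binary cross-entropy for the adversarial term), function names, and weights `lam_con`/`lam_adv` are illustrative assumptions, not the paper's exact formulation.

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length density-map vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def bce(p, label):
    """Binary cross-entropy for a discriminator probability p in (0, 1)."""
    eps = 1e-12
    return -(label * math.log(p + eps) + (1 - label) * math.log(1 - p + eps))

def total_loss(src_pred, src_gt, latent_pred, disc_p_latent,
               lam_con=1.0, lam_adv=0.1):
    """Illustrative combined objective (weights are assumed, not from the paper)."""
    l_count = mse(src_pred, src_gt)     # supervised counting loss on the source domain
    l_con = mse(latent_pred, src_pred)  # consistency: latent output should match source output
    l_adv = bce(disc_p_latent, 1)       # adversarial term: generator tries to fool the discriminator
    return l_count + lam_con * l_con + lam_adv * l_adv
```

In practice the discriminator would be trained with the opposite labels on latent versus target features, alternating with generator updates, so that minimizing `l_adv` pushes the latent domain toward the target distribution.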

First Page


Last Page




Publication Date



domain adaptation, object counting, unsupervised learning


IR Deposit conditions:

OA version (pathway a) Accepted version

No embargo

When accepted for publication, set statement to accompany deposit (see policy)

Must link to publisher version with DOI

Publisher copyright and source must be acknowledged