Robustly Train Normalizing Flows via KL Divergence Regularization
Document Type
Conference Proceeding
Publication Title
Proceedings of the AAAI Conference on Artificial Intelligence
Abstract
In this paper, we find that the training of Normalizing Flows (NFs) is easily affected by outliers and by a small number (or high dimensionality) of training samples. To address this problem, we propose a Kullback–Leibler (KL) divergence regularization on the Jacobian matrix of NFs. We prove that this regularization is equivalent to adding a set of samples whose covariance matrix is the identity matrix to the training set. It therefore simultaneously reduces the negative influence of outliers and of the small sample size on the estimation of the covariance matrix, making the training of NFs robust. Finally, we evaluate the performance of NFs on out-of-distribution (OoD) detection tasks. The results demonstrate the effectiveness of the proposed regularization term. For example, with the proposed regularization, the OoD detection score increases by up to 30% compared with training without it.
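The following is a minimal, illustrative sketch (not the authors' code) of the idea described in the abstract: alongside the usual normalizing-flow negative log-likelihood, a KL-divergence penalty pushes the covariance of the flow's latent codes toward the identity matrix, which is one concrete reading of the stated equivalence to adding identity-covariance samples. The toy affine flow, the penalty weight `lam`, and all function names are assumptions for demonstration only.

```python
# Illustrative sketch: KL-divergence regularization for normalizing-flow training.
# The exact regularizer on the Jacobian used in the paper may differ; here we
# penalize KL(N(0, Sigma_z) || N(0, I)) where Sigma_z is the batch covariance
# of the latent codes z = f(x).
import torch
import torch.nn as nn


class SimpleAffineFlow(nn.Module):
    """Toy invertible affine map z = A x + b (stand-in for a real NF)."""

    def __init__(self, dim):
        super().__init__()
        self.A = nn.Parameter(torch.eye(dim) + 0.01 * torch.randn(dim, dim))
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = x @ self.A.T + self.b
        # For an affine map, log|det J| is the same for every sample in the batch.
        log_det = torch.slogdet(self.A)[1] * torch.ones(x.shape[0])
        return z, log_det


def nf_loss_with_kl_reg(flow, x, lam=0.1):
    """Standard NF negative log-likelihood plus a KL regularizer (hypothetical form)."""
    z, log_det = flow(x)
    d = z.shape[1]
    # NLL under a standard Gaussian base distribution (constants dropped).
    nll = 0.5 * (z ** 2).sum(dim=1) - log_det
    # Batch covariance of the latent codes, with a small ridge for stability.
    zc = z - z.mean(dim=0, keepdim=True)
    sigma = zc.T @ zc / (z.shape[0] - 1) + 1e-5 * torch.eye(d)
    # KL(N(0, Sigma) || N(0, I)) = 0.5 * (tr(Sigma) - d - log det(Sigma)).
    kl = 0.5 * (torch.trace(sigma) - d - torch.slogdet(sigma)[1])
    return nll.mean() + lam * kl


if __name__ == "__main__":
    torch.manual_seed(0)
    flow = SimpleAffineFlow(dim=4)
    opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
    x = torch.randn(256, 4) * 2.0  # synthetic training batch
    loss = nf_loss_with_kl_reg(flow, x)
    loss.backward()
    opt.step()
    print(float(loss))
```

Under these assumptions, setting `lam = 0` recovers plain maximum-likelihood training, while a positive `lam` biases the latent covariance toward the identity, which is the robustness effect the abstract describes.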
First Page
15047
Last Page
15055
DOI
10.1609/aaai.v38i13.29426
Publication Date
3-25-2024
Recommended Citation
K. Song et al., "Robustly Train Normalizing Flows via KL Divergence Regularization," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 38, no. 13, pp. 15047–15055, Mar. 2024.
The definitive version is available at https://doi.org/10.1609/aaai.v38i13.29426