Doubly Robust AUC Optimization against Noisy and Adversarial Samples
Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Area under the ROC curve (AUC) is an important and widely used metric in machine learning, especially for imbalanced datasets. In current practical learning problems, not only adversarial samples but also noisy samples seriously threaten the performance of learning models. Many research works have been proposed to defend against adversarial samples and noisy samples separately. Unfortunately, to the best of our knowledge, none of the existing AUC optimization methods can secure against both kinds of harmful samples simultaneously. To fill this gap and address this challenge, in this paper we propose a novel doubly robust AUC optimization (DRAUC) algorithm. Specifically, we first exploit the deep integration of self-paced learning and adversarial training under the framework of AUC optimization, and provide a statistical upper bound on the AUC adversarial risk. Inspired by this upper bound, we propose our optimization objective together with an efficient alternating stochastic descent algorithm, which effectively improves the performance of learning models by guarding against both adversarial samples and noisy samples. Experimental results on several standard datasets demonstrate that our DRAUC algorithm achieves better noise robustness and adversarial robustness than state-of-the-art algorithms.
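To make the three ingredients named in the abstract concrete, the following is a minimal illustrative sketch (not the paper's actual DRAUC objective) of how a pairwise surrogate for 1 − AUC, a one-step FGSM-style adversarial perturbation, and hard self-paced sample weighting can be combined for a simple linear scorer. All function names, the squared-hinge surrogate, and the thresholding rule are assumptions chosen for illustration.

```python
import numpy as np


def pairwise_auc_loss(w, X_pos, X_neg):
    """Squared pairwise hinge surrogate for 1 - AUC.

    Returns a (n_pos, n_neg) matrix of per-pair losses for the
    linear scorer s(x) = x @ w.
    """
    d = (X_pos @ w)[:, None] - (X_neg @ w)[None, :]
    return np.maximum(0.0, 1.0 - d) ** 2


def fgsm_linear(w, X, y_sign, eps):
    """One-step L_inf perturbation for a linear scorer.

    Pushes each sample's score in the direction that hurts ranking:
    positives (y_sign = +1) get their scores lowered, negatives
    (y_sign = -1) get theirs raised, by eps * ||w||_1 each.
    """
    return X - eps * y_sign[:, None] * np.sign(w)[None, :]


def self_paced_weights(pair_losses, lam):
    """Hard self-paced weights: keep pairs whose loss is at most lam.

    High-loss pairs are treated as potentially noisy and are
    down-weighted to zero (an assumption-level simplification of
    self-paced learning).
    """
    return (pair_losses <= lam).astype(float)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = np.array([1.0, -2.0])
    X_pos = rng.normal(loc=0.5, size=(5, 2))
    X_neg = rng.normal(loc=-0.5, size=(4, 2))

    clean = pairwise_auc_loss(w, X_pos, X_neg)

    # Adversarial step: perturb both classes against the ranking.
    X_pos_adv = fgsm_linear(w, X_pos, np.ones(5), eps=0.1)
    X_neg_adv = fgsm_linear(w, X_neg, -np.ones(4), eps=0.1)
    adv = pairwise_auc_loss(w, X_pos_adv, X_neg_adv)

    # Self-paced step: down-weight the hardest (possibly noisy) pairs.
    weights = self_paced_weights(adv, lam=np.median(adv))
    weighted_risk = np.sum(weights * adv) / max(weights.sum(), 1.0)
```

For a linear scorer the perturbation shrinks every positive-negative score gap by 2·eps·||w||₁, so the adversarial pair losses dominate the clean ones elementwise; the self-paced threshold lam then controls how aggressively high-loss (potentially noisy) pairs are discarded.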
adversarial training, AUC optimization, self-paced learning
C. Zhang et al., "Doubly Robust AUC Optimization against Noisy and Adversarial Samples," Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 3195–3205, Aug. 2023.
The definitive version is available at https://doi.org/10.1145/3580305.3599316