Fast and Adversarial Robust Kernelized SDU Learning
Document Type
Conference Proceeding
Publication Title
Proceedings of Machine Learning Research
Abstract
SDU learning, a weakly supervised learning problem in which only pairwise similarity and dissimilarity data points together with unlabeled data are available, has many practical applications. However, it still lacks defenses against adversarial examples, and its learning process can be expensive. To address this gap, we propose a novel adversarial training framework for SDU learning. Departing from conventional adversarial training methods, our approach reformulates the standard minimax problem as an equivalent minimization problem from the kernel perspective. Additionally, we employ a stochastic gradient method and random features to accelerate training. Theoretical analysis shows that our method converges to a stationary point at a rate of O(1/T^(1/4)). Our experimental results show that our algorithm outperforms other adversarial training methods in generalization, efficiency, and scalability against various adversarial attacks.
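The abstract mentions random features as one ingredient for accelerating kernelized training. As a minimal, illustrative sketch (not the authors' implementation), the random Fourier feature map of Rahimi and Recht approximates an RBF kernel with an explicit low-dimensional embedding; the bandwidth `sigma`, feature count, and data below are assumptions chosen for illustration:

```python
import numpy as np

def random_fourier_features(X, n_features=2000, sigma=1.0, seed=0):
    """Map X to features z(X) so that z(x) @ z(y) approximates the
    RBF kernel exp(-||x - y||^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Fourier transform of the RBF kernel
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the feature-map approximation against the exact RBF kernel
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X)
K_approx = Z @ Z.T
K_exact = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1) / 2.0)
print(np.max(np.abs(K_approx - K_exact)))  # small approximation error
```

Training a linear model on `Z` then stands in for kernel learning at a fraction of the cost, which is the kind of speedup the abstract alludes to.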
First Page
1153
Last Page
1161
Publication Date
1-1-2024
Recommended Citation
Y. Fan et al., "Fast and Adversarial Robust Kernelized SDU Learning," Proceedings of Machine Learning Research, vol. 238, pp. 1153–1161, Jan. 2024.