End-to-End Semi-Supervised Ordinal Regression AUC Maximization with Convolutional Kernel Networks
Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Convolutional kernel networks (CKNs) were proposed for image classification tasks and achieve performance competitive with classical neural networks while being easy to train and robust to overfitting. In real-world ordinal regression problems, we usually have plenty of unlabeled data but only a limited amount of labeled ordered data. Although recent studies have shown that directly optimizing AUC can impose a better ranking on the data than optimizing the traditional error rate, designing an efficient semi-supervised ordinal regression AUC maximization algorithm based on CKNs with a convergence guarantee remains an open question. To address this question, we propose a new semi-supervised ordinal regression CKN algorithm (S2CKNOR) with end-to-end AUC maximization. Specifically, we decompose the ordinal regression problem into a series of binary classification subproblems and propose an unbiased non-convex objective function to optimize AUC, so that both labeled and unlabeled data can be used to improve model performance. Further, we propose a nested alternating minimization algorithm to solve the non-convex objective, where each convex subproblem is solved by a quadruply stochastic gradient algorithm and the outer non-convex problem is solved by the stochastic projected gradient method. Importantly, we prove that our S2CKNOR algorithm converges to a critical point of the non-convex objective. Extensive experimental results demonstrate that S2CKNOR achieves the best AUC results on various real-world datasets. © 2022 ACM.
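The abstract's decomposition of ordinal regression into binary classification subproblems can be illustrated with the standard threshold-based reduction: for ranks 1..R, subproblem k asks whether an example's rank exceeds k. This is a minimal sketch of that generic reduction, not the paper's exact formulation; the function name and the 1..4 rank scale below are illustrative assumptions.

```python
def decompose_ordinal(labels, num_ranks):
    """Reduce ordinal labels 1..num_ranks to num_ranks-1 binary label lists.

    Subproblem k (k = 1..num_ranks-1) asks: is the rank greater than k?
    This is the common threshold-based reduction, used here only to
    illustrate the decomposition described in the abstract.
    """
    return [[1 if y > k else 0 for y in labels] for k in range(1, num_ranks)]

labels = [1, 3, 2, 4, 2]  # ordinal ranks on an assumed 1..4 scale
subproblems = decompose_ordinal(labels, num_ranks=4)
# subproblem 1 (rank > 1): [0, 1, 1, 1, 1]
# subproblem 2 (rank > 2): [0, 1, 0, 1, 0]
# subproblem 3 (rank > 3): [0, 0, 0, 1, 0]
```

Each binary subproblem can then be handled by a pairwise (AUC-style) objective over labeled and unlabeled examples, which is where the paper's semi-supervised AUC maximization applies.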
AUC, deep kernel networks, semi-supervised ordinal regression, convolution, gradient methods, stochastic systems
Z. Xiong, W. Shi, and B. Gu, "End-to-End Semi-Supervised Ordinal Regression AUC Maximization with Convolutional Kernel Networks", in Proc. of the ACM SIGKDD Intl. Conf. on Knowledge Discovery and Data Mining (KDD 2022), Washington, DC, Aug. 2022, pp. 2140–2150, doi: 10.1145/3534678.3539307