MEDIA: An Incremental DNN Based Computation Offloading for Collaborative Cloud-Edge Computing

Document Type

Article

Publication Title

IEEE Transactions on Network Science and Engineering

Abstract

Mobile Cloud Computing (MCC) provides computing, storage, and a wealth of other services to end users. Offloading tasks to cloud servers can satisfy the demand for extensive computing resources, but may also cause network congestion and high latency. Mobile Edge Computing (MEC) places computing nodes near the end users to enable low-latency services, but its limited computing resources prevent it from executing many computing tasks. MCC and MEC are therefore highly complementary. For computation offloading problems in a collaborative cloud-edge environment, traditional optimization algorithms require multiple iterations to obtain results, leading to excessive time spent deriving offloading strategies. Deep Neural Network (DNN) based offloading algorithms can provide low-latency offloading strategies, but their training data is difficult to obtain and the cost of retraining is high. In this paper, we therefore adopt an incremental training method to overcome the problems of insufficient training data and high retraining costs in DNN-based offloading algorithms. An incremental DNN-based computation offloading (MEDIA) algorithm is proposed to derive near-optimal offloading strategies for collaborative cloud-edge computing. Task information from real scenarios is sent to the central cloud to generate training data, and the central cloud's powerful computing resources improve the efficiency of model training. Continuous incremental training maintains the high accuracy of the DNN model and reduces the time needed to train it. The evaluation results demonstrate that the proposed algorithm substantially reduces the cost of updating the model without loss of performance.
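The core idea the abstract describes, updating an already-trained model on newly generated data instead of retraining from scratch, can be illustrated with a minimal sketch. The class and helper names below (`IncrementalOffloadModel`, `make_batch`) are hypothetical and not from the paper; a toy logistic-regression classifier stands in for the DNN, and warm-started gradient steps stand in for incremental training.

```python
import numpy as np

class IncrementalOffloadModel:
    """Toy binary offloading classifier; 1 = offload, 0 = run locally.
    Stands in for the paper's DNN for illustration only."""

    def __init__(self, n_features, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.01, size=n_features)
        self.b = 0.0
        self.lr = lr

    def _predict_proba(self, X):
        z = X @ self.w + self.b
        return 1.0 / (1.0 + np.exp(-z))

    def partial_fit(self, X, y, epochs=20):
        # Incremental update: continue gradient descent from the
        # current weights (warm start) rather than reinitializing.
        for _ in range(epochs):
            p = self._predict_proba(X)
            self.w -= self.lr * (X.T @ (p - y)) / len(y)
            self.b -= self.lr * np.mean(p - y)

    def predict(self, X):
        return (self._predict_proba(X) >= 0.5).astype(int)

# Hypothetical data generator: the cloud labels each task with a
# near-optimal offloading decision and ships batches for training.
rng = np.random.default_rng(1)

def make_batch(n=200):
    X = rng.normal(size=(n, 2))              # e.g. task size, link quality
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # offload iff combined load is high
    return X, y

model = IncrementalOffloadModel(n_features=2)
for _ in range(5):                 # five incremental rounds of fresh data
    X, y = make_batch()
    model.partial_fit(X, y)        # update, don't retrain from scratch

X_test, y_test = make_batch()
acc = np.mean(model.predict(X_test) == y_test)
```

Each round touches only the newest batch, so the per-update cost stays small while accuracy is maintained across rounds, which is the trade-off the abstract claims for MEDIA.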

First Page

1986

Last Page

1998

DOI

10.1109/TNSE.2023.3335345

Publication Date

11-28-2023

Keywords

Task analysis, Data models, Cloud computing, Training data, Training, Computational modeling, Costs

Comments

IR conditions: non-described
