Mobile edge computing (MEC) has become a new paradigm for reducing deep neural network (DNN) inference latency on resource-limited mobile devices by collaborating with edge servers. However, device mobility makes it difficult to determine the collaborating edge site, and limited battery capacity prevents devices from working continuously. We therefore propose a DNN collaborative inference framework for MEC in which mobile devices are equipped with energy harvesting (EH) equipment. To better capture the uncertainty of energy harvesting and the randomness of task arrivals, we formulate the problem as a constrained Markov decision process (CMDP), jointly optimizing the energy harvesting strategy, edge site selection, and DNN partition strategy to minimize the average inference latency while keeping each mobile device's battery energy level stable. Because general deep reinforcement learning (DRL) cannot solve a CMDP directly, we first transform the CMDP into a Markov decision process (MDP) by leveraging the Lyapunov optimization technique. We then decompose the MDP into two sub-problems: the energy harvesting strategy (EHS) and DNN collaborative inference (DNNCI), and propose a DRL-based algorithm to solve the DNNCI sub-problem. Extensive simulation results demonstrate that our algorithm outperforms existing benchmarks and stabilizes the battery levels of mobile devices.
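The Lyapunov step mentioned above can be illustrated with a minimal drift-plus-penalty sketch. This is not the paper's implementation: the weight `V`, the per-action latency and energy figures, and the harvested-energy value are all assumed placeholders. A virtual queue tracks each device's net battery drain, and the per-slot objective trades inference latency against that drift, which is what allows the constrained problem to be handled slot by slot as an MDP.

```python
# Illustrative sketch (assumed parameters, not the paper's code) of the
# Lyapunov drift-plus-penalty technique for turning a time-averaged
# battery constraint into a per-slot cost.

def drift_plus_penalty(latency, energy_used, energy_harvested, queue, V=10.0):
    """Per-slot cost: V * latency + queue * net energy drain.
    V (assumed weight) balances latency against battery stability."""
    return V * latency + queue * (energy_used - energy_harvested)

def update_queue(queue, energy_used, energy_harvested):
    """Virtual-queue update: the queue grows when the device drains
    more energy than it harvests, pushing later decisions to save energy."""
    return max(queue + energy_used - energy_harvested, 0.0)

# Toy example: choose between local inference (slow, cheap) and full
# offloading (fast, energy-hungry) for one time slot.
actions = {
    "local":   {"latency": 0.9, "energy": 0.2},
    "offload": {"latency": 0.3, "energy": 0.8},
}
queue = 0.0        # empty virtual queue: latency dominates the cost
harvested = 0.4    # assumed harvested energy this slot
best = min(actions, key=lambda a: drift_plus_penalty(
    actions[a]["latency"], actions[a]["energy"], harvested, queue))
queue = update_queue(queue, actions[best]["energy"], harvested)
# With an empty queue the cheaper-latency action is chosen; as the queue
# builds up, energy-frugal actions become preferable.
```

In the full framework the DRL agent, rather than this one-step minimization, learns the DNNCI decisions, but the virtual queue plays the same stabilizing role.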
Jiaqi Wu, Lin Huang, Huaize Liu, Lin Gao