Yinglei Teng, Yan Mei, Dan Liu, Zhu Han, Mei Song
The rapid expansion of Machine Type Communication (MTC) has brought many challenges, such as sporadic uplink transmission congestion and the energy consumption of battery-limited nodes. To address these challenges, we consider a green energy-harvesting massive MTC (mMTC) system in which each device is equipped with a local cache and a rechargeable battery to store, respectively, the collected data and the green energy harvested from the environment. An uplink power control method is then proposed to regulate transmission while simultaneously minimizing the system's delay cost and the battery depreciation cost. The optimization problem, with its duplex queueing structure, is formulated as a Markov decision process (MDP), and we utilize the Deep Distributed Recurrent Q-Networks (DDRQN) algorithm to manage the complex dynamics of the channel, data, and energy environment through a partially observable state. Our simulation results show that the proposed learning scheme can dramatically reduce the system cost, prolong the lifetime of the mMTC network, and approach the performance of the conventional centralized reinforcement learning method.
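The MDP described above couples a data queue with an energy queue under a per-slot power decision. A minimal sketch of that structure, using plain tabular Q-learning rather than the paper's DDRQN, is shown below; the state space, arrival/harvesting dynamics, and cost weights are all hypothetical toy choices for illustration only.

```python
import random

# Toy illustration (NOT the paper's DDRQN): tabular Q-learning for one MTC
# device choosing an uplink transmit-power level each slot. The state is
# (queue length, battery level); all dynamics below are assumptions.

Q_MAX, B_MAX = 5, 5            # data-queue and battery capacities (assumed)
ACTIONS = [0, 1, 2]            # power levels: idle / low / high (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = {((q, b), a): 0.0
     for q in range(Q_MAX + 1)
     for b in range(B_MAX + 1)
     for a in ACTIONS}

def step(state, a):
    """Hypothetical dynamics: spending power `a` drains the battery and
    clears `a` packets; one packet may arrive and one unit of green energy
    may be harvested per slot."""
    q, b = state
    a = min(a, b)                                          # battery-limited
    q2 = min(Q_MAX, max(0, q - a) + random.randint(0, 1))  # data arrivals
    b2 = min(B_MAX, b - a + random.randint(0, 1))          # energy harvesting
    cost = q2 + 0.5 * a        # delay cost + battery-depreciation proxy
    return (q2, b2), -cost     # reward = negative system cost

def choose(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

random.seed(0)
state = (0, B_MAX)
for _ in range(20000):
    a = choose(state)
    nxt, r = step(state, a)
    best = max(Q[(nxt, ap)] for ap in ACTIONS)
    Q[(state, a)] += ALPHA * (r + GAMMA * best - Q[(state, a)])
    state = nxt
```

The paper replaces this table with a recurrent Q-network so each device can act on its own partial observation; the sketch only conveys the duplex queue/battery state and the delay-plus-depreciation cost trade-off.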