This study presents EATS-MEC, an intelligent multi-access edge computing (MEC) framework for energy-aware task offloading and resource scheduling in ultra-dense network environments. The objective is to improve energy efficiency, scalability, and latency compliance under heterogeneous, mobile edge conditions. EATS-MEC integrates deep reinforcement learning (DRL) for real-time task allocation with a lightweight blockchain module that ensures secure, decentralized execution across edge, fog, and cloud layers. Unlike classical approaches such as Deep Q-Networks (DQN) and genetic algorithms (GA), EATS-MEC adapts to real-time network and mobility feedback to determine the optimal execution location for each task. Simulations show that EATS-MEC reduces peak energy consumption by 32%, extends device battery life by up to 20 hours, and achieves an 88.3% task success rate under stringent deadline constraints. The framework performs strongly in mobility-aware energy usage, exhibits near-sublinear energy scaling with increasing device density, and sustains high task throughput with over 100,000 concurrent tasks. Results indicate that EATS-MEC outperforms existing baselines on the energy-latency trade-off and operates close to the Pareto frontier. Its robustness, security, and adaptivity make EATS-MEC well suited to real-world smart city infrastructures, healthcare IoT, and latency-sensitive industrial applications.
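The abstract does not give the framework's decision rule, but the kind of per-task offloading choice it describes (pick local, edge, fog, or cloud execution by weighing energy against latency under a deadline) can be illustrated with a toy cost model. All tier parameters, names, and weights below are invented for illustration; a minimal sketch, not the paper's method:

```python
# Toy per-task offloading decision: choose the execution tier that minimizes
# a weighted energy + latency cost while meeting the task's deadline.
# All numbers and tier names are hypothetical, not taken from the paper.

TIERS = {
    # tier: (compute_rate in MI/s, transmit_energy J/MB, link_latency s/MB)
    "local": (500.0, 0.00, 0.00),
    "edge":  (4000.0, 0.05, 0.02),
    "fog":   (8000.0, 0.08, 0.05),
    "cloud": (20000.0, 0.12, 0.15),
}

LOCAL_ENERGY_PER_MI = 0.002  # J per million instructions executed on the device

def task_cost(tier, size_mb, work_mi, deadline_s, w_energy=0.5, w_latency=0.5):
    """Weighted energy-latency cost of running one task on a given tier."""
    rate, tx_energy, tx_latency = TIERS[tier]
    latency = size_mb * tx_latency + work_mi / rate
    energy = size_mb * tx_energy
    if tier == "local":
        energy += work_mi * LOCAL_ENERGY_PER_MI
    if latency > deadline_s:
        return float("inf")  # deadline miss counts as infeasible
    return w_energy * energy + w_latency * latency

def best_tier(size_mb, work_mi, deadline_s):
    """Greedy choice: the feasible tier with the lowest weighted cost."""
    return min(TIERS, key=lambda t: task_cost(t, size_mb, work_mi, deadline_s))
```

For a compute-heavy task with a tight deadline, local execution becomes infeasible and an offload target is selected; a DRL agent would learn such a policy from feedback rather than evaluate a fixed formula, which is how it can adapt to mobility and changing network state.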
Ran Bi, Jiankang Ren, Hao Wang, Qian Liu, Xiuyuan Yang
Zhongjin Li, Victor Chang, Jidong Ge, Linxuan Pan, Haiyang Hu, Binbin Huang
Ting Li, Haitao Liu, Jie Liang, Hangsheng Zhang, Liru Geng, Yinlong Liu
Zhibo Wang, Yunan Sun, Defang Liu, Jiahui Hu, Xiaoyi Pang, Yuke Hu, Kui Ren