CONFERENCE PAPER

Resource Allocation in Multi-access Edge Computing: Optimization and Machine Learning

Xian Liu

Year: 2021 · Published in: 2021 IEEE 12th Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON) · Pages: 0365-0370

Abstract

Multi-access edge computing (MEC) equipped with artificial intelligence is a promising technology for B5G wireless systems, but refined investigation and analysis are needed to gain deeper insight. This paper shows that the core concept can be derived from the wait-and-see model in stochastic programming and identifies a quasi-separable property. Both small-scale fading and pathloss are included in the analysis. The study has two aspects: the optimization model itself, followed by simulation with machine learning (ML). A main motivation for using ML is improved computational efficiency; simulations showed that efficiency may be improved from 93% to 96%.
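The wait-and-see model mentioned in the abstract is a standard notion in stochastic programming: the decision maker observes the random outcome first and then optimizes, in contrast to the here-and-now decision made before the uncertainty resolves. A minimal sketch of that distinction is below; the cost function, demand scenarios, and probabilities are hypothetical illustrations, not taken from the paper.

```python
# Toy illustration of the wait-and-see (WS) model vs. a here-and-now (HN)
# decision in stochastic programming. All numbers here are assumptions
# for illustration, not values from the paper.

def cost(x, d):
    """Quadratic mismatch between allocated resource x and demand d,
    plus a linear provisioning cost (hypothetical cost model)."""
    return (x - d) ** 2 + 0.5 * x

scenarios = [2.0, 4.0, 8.0]   # possible demand realizations (assumed)
probs     = [0.5, 0.3, 0.2]   # scenario probabilities (assumed)

# Coarse grid search over allocations x in [0, 10].
grid = [i * 0.01 for i in range(1001)]

# Wait-and-see: demand is observed first, so the allocation is optimized
# separately inside each scenario; the objective is the expected optimum.
ws = sum(p * min(cost(x, d) for x in grid)
         for p, d in zip(probs, scenarios))

# Here-and-now: one allocation must be fixed before demand is known;
# minimize the expected cost across all scenarios.
hn = min(sum(p * cost(x, d) for p, d in zip(probs, scenarios))
         for x in grid)

print(f"wait-and-see: {ws:.4f}, here-and-now: {hn:.4f}")
# WS never exceeds HN; the gap is the expected value of perfect information.
```

Because each per-scenario problem is solved independently, the wait-and-see objective decomposes across scenarios, which is the kind of separability structure the abstract's "quasi-separable property" alludes to.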

Keywords:
Computer science, Enhanced Data Rates for GSM Evolution, Edge computing, Resource allocation, Wireless, Artificial intelligence, Distributed computing, Mathematical optimization, Computer network, Telecommunications

Metrics

Cited By: 8
FWCI (Field Weighted Citation Impact): 2.18
References: 8
Citation Normalized Percentile: 0.88

Topics

IoT and Edge/Fog Computing (Physical Sciences → Computer Science → Computer Networks and Communications)
IoT Networks and Protocols (Physical Sciences → Engineering → Electrical and Electronic Engineering)
Advanced Wireless Communication Technologies (Physical Sciences → Engineering → Electrical and Electronic Engineering)
