Yongjian Luo, Chengxi Liu, Qiupin Lai
The uncertainty and intermittency of renewable generation and randomly fluctuating loads pose great challenges to the operation of multi-energy microgrids (MEMGs). Traditional methods for handling these uncertainties, such as stochastic optimization and model predictive control, require an accurate parametric model of the uncertainties. In this paper, a deep reinforcement learning method based on the soft actor-critic (SAC) algorithm is proposed to achieve optimal scheduling of MEMGs without requiring exact parameters of the uncertainties. First, a deterministic optimal scheduling model is established for an MEMG without considering the uncertainties. Then, the established model is transformed into a Markov decision process (MDP), where the uncertainties are embedded in the state space, and the action space and reward function are designed based on prior knowledge. Next, an SAC-based deep reinforcement learning method is proposed to solve the MDP. Finally, simulation results on a demonstrative MEMG validate the effectiveness of the proposed method.
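The MDP formulation described above can be illustrated with a minimal sketch. The following is not the paper's actual model: the state variables (time, renewable output, load, battery state of charge), the single battery-power action, the capacity/price parameters, and the cost-based reward are all simplified assumptions chosen for illustration. The key idea it mirrors is that the uncertain renewable generation and load enter the state space directly, so the agent never needs a parametric model of the uncertainty.

```python
import numpy as np


class MEMGEnv:
    """Hypothetical, simplified MDP for MEMG scheduling (illustrative only).

    State:  [hour / 24, renewable output (kW), load (kW), battery SoC]
    Action: scalar in [-1, 1], scaled to +/- p_max (positive = charge)
    Reward: negative operating cost (cost of importing power from the grid)
    """

    def __init__(self, p_max=50.0, capacity=200.0, price=0.1, seed=0):
        self.rng = np.random.default_rng(seed)  # random generation and load
        self.p_max = p_max          # assumed battery power limit (kW)
        self.capacity = capacity    # assumed battery capacity (kWh)
        self.price = price          # assumed grid import price ($/kWh)
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.5
        return self._state()

    def _state(self):
        # Uncertain quantities are sampled and exposed in the state, so the
        # agent observes realizations rather than a parametric uncertainty model.
        self.renew = max(0.0, 30.0 * np.sin(np.pi * self.t / 24)
                         + self.rng.normal(0.0, 5.0))
        self.load = 40.0 + self.rng.normal(0.0, 8.0)
        return np.array([self.t / 24, self.renew, self.load, self.soc],
                        dtype=np.float32)

    def step(self, action):
        p_batt = float(np.clip(action, -1.0, 1.0)) * self.p_max
        # Enforce SoC feasibility over a one-hour step (prior knowledge
        # about physical limits shapes the effective action space).
        new_soc = np.clip(self.soc + p_batt / self.capacity, 0.0, 1.0)
        p_batt = (new_soc - self.soc) * self.capacity
        self.soc = new_soc
        # The grid balances supply and demand; import cost defines the reward.
        grid_import = self.load + p_batt - self.renew
        reward = -max(grid_import, 0.0) * self.price
        self.t += 1
        done = self.t >= 24
        return self._state(), reward, done
```

A SAC agent (e.g. from an off-the-shelf RL library) would then be trained on rollouts of this environment; the continuous action space is one reason SAC is a natural fit here.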