TY - JOUR
T1 - Deep reinforcement learning for optimal microgrid energy management with renewable energy and electric vehicle integration
AU - Xiong, Baoyin
AU - Zhang, Lili
AU - Hu, Yang
AU - Fang, Fang
AU - Liu, Qingzhi
AU - Cheng, Long
PY - 2025/5
Y1 - 2025/5
AB - The development and utilization of renewable energy sources (RES) are gaining unprecedented attention as a response to the environmental, economic, and energy security challenges posed by non-renewable fossil fuels. Nonetheless, integrating renewable energy into large-scale power grids remains a complex endeavour, which constrains the widespread adoption of these sustainable energy sources. Microgrids, which can function both autonomously and in coordination with the larger grid, provide an effective solution for integrating RES into the broader power system. To coordinate and optimize the various energy resources within a microgrid so that demand is met while stability and efficiency are maintained, the deployment of an Energy Management System (EMS) is crucial. This paper proposes a deep reinforcement learning (DRL)-based real-time optimal energy management method that assists the EMS of a microgrid (MG) in making optimal scheduling decisions. Electric vehicles are introduced as a new controllable power source into the MG, alongside uncontrollable photovoltaic and wind power sources, enhancing the self-balancing capability of the entire system. The efficacy of the proposed methodology is validated via a case study: in comparison with a traditional energy scheduling approach, our method improves the self-balancing rate by up to 22.97% and increases the operator's profit by as much as 33.74%. These results demonstrate the effectiveness and practical value of the proposed methodology.
KW - Deep reinforcement learning
KW - Electric vehicles
KW - Microgrid
KW - PPO
KW - Uncertainties
U2 - 10.1016/j.asoc.2025.113180
DO - 10.1016/j.asoc.2025.113180
M3 - Article
AN - SCOPUS:105003715113
SN - 1568-4946
VL - 176
JO - Applied Soft Computing
JF - Applied Soft Computing
M1 - 113180
ER -