Author: Huanhuan Nie, Ying Chen, Yue Xia, Shaowei Huang, Bingqian Liu
Language: English
Year of publication: 2020
Source: IEEE Access, vol. 8, pp. 153455-153469 (2020)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3018142
Description:
Extreme disasters may interrupt the power supply to the distribution system (DS), forcing it to operate in island mode as an islanded microgrid (MG). To improve the post-disaster resilience of the DS and to supply as many loads as possible, for as long as possible, with limited generation resources, this paper proposes a multi-agent deep reinforcement learning (DRL) method that realizes dual control on the source and load sides of the MG. The resilience-improvement problem is converted into a sequential decision-making problem whose objective is to maximize the cumulative MG utility value over the power outage duration, and a multi-agent DRL model is proposed to solve it. A dual control policy combining energy storage management and a load-shedding strategy is put forward to maximize the MG utility value. A reinforcement learning (RL) environment for the islanded MG, built on OpenAI Gym and OpenDSS, is constructed as a simulator; it exposes a general interface that is compatible with, and can be published to, OpenAI Gym. Numerical simulations are performed for an MG equipped with wind turbines, diesel generators, and storage devices to validate the effectiveness of the proposed method. The influence of available generation resources and power outage duration on the control policy is discussed, confirming the strong adaptability of the proposed method under different conditions.
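To make the Gym-compatible formulation concrete, the following is a minimal illustrative sketch of an islanded-MG environment in the classic OpenAI Gym style. The class name, state variables (storage state of charge, wind output, diesel fuel), the action definition (storage setpoint plus load-shedding fraction), and the per-step utility reward are simplifying assumptions for illustration only, not the paper's OpenDSS-backed simulator or its actual utility function.

```python
import numpy as np
import gym
from gym import spaces


class IslandedMGEnv(gym.Env):
    """Toy islanded-microgrid environment (illustrative sketch only)."""

    def __init__(self, outage_hours=24):
        super().__init__()
        self.outage_hours = outage_hours
        # observation: [storage SOC, wind output, diesel fuel left, outage progress]
        self.observation_space = spaces.Box(0.0, 1.0, shape=(4,), dtype=np.float32)
        # action[0]: storage power setpoint in [-1, 1] (negative = charge)
        # action[1]: fraction of load kept online in [0, 1]
        self.action_space = spaces.Box(
            low=np.array([-1.0, 0.0], dtype=np.float32),
            high=np.array([1.0, 1.0], dtype=np.float32),
        )
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.8   # storage state of charge
        self.fuel = 1.0  # normalized diesel fuel remaining
        return self._obs()

    def _wind(self):
        # toy diurnal wind-power profile
        return 0.5 + 0.3 * np.sin(2 * np.pi * self.t / 24)

    def _obs(self):
        return np.array(
            [self.soc, self._wind(), self.fuel, self.t / self.outage_hours],
            dtype=np.float32,
        )

    def step(self, action):
        storage_p = float(np.clip(action[0], -1.0, 1.0))
        load_kept = float(np.clip(action[1], 0.0, 1.0))
        # crude energy balance: wind plus storage discharge plus diesel output
        supplied = self._wind() + max(storage_p, 0.0) * self.soc + 0.3 * self.fuel
        served = min(load_kept, supplied)
        self.soc = float(np.clip(self.soc - 0.1 * storage_p, 0.0, 1.0))
        self.fuel = max(self.fuel - 0.02, 0.0)
        # per-step utility of the load actually served; the DRL agents would
        # maximize the cumulative sum of this quantity over the outage duration
        reward = served
        self.t += 1
        done = self.t >= self.outage_hours
        return self._obs(), reward, done, {}
```

A rollout that repeatedly calls env.step() until done and sums the rewards yields the cumulative utility over the outage; in the paper's setting this is the quantity the source-side and load-side agents jointly learn to maximize.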
Database: Directory of Open Access Journals