Community energy storage operation via reinforcement learning with eligibility traces

Edgar Mauricio Salazar Duque*, Juan S. Giraldo, Pedro P. Vergara, Phuong H. Nguyen, Anne van der Molen, J.G. Slootweg

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

9 Citations (Scopus)
118 Downloads (Pure)

Abstract

The operation of a community energy storage system (CESS) is challenging due to the volatility of photovoltaic distributed generation, electricity consumption, and energy prices. Selecting the optimal CESS setpoints throughout the day is a sequential decision problem under uncertainty, which can be solved using dynamic learning methods. This paper proposes a reinforcement learning (RL) technique based on temporal-difference learning with eligibility traces (ET). It aims to minimize day-ahead energy costs while maintaining the technical limits at the grid coupling point. The performance of the RL agent is compared against an oracle based on a deterministic mixed-integer second-order cone program (MISOCP). The use of ET boosts the RL agent's learning rate for the CESS operation problem: the traces effectively assign credit to the action sequences that bring the CESS to a high state of charge before the peak prices, reducing the training time. The case study shows that the proposed method learns to operate the CESS effectively and ten times faster than common RL algorithms applied to energy systems, such as Tabular Q-learning and Fitted-Q. Moreover, the RL agent operates the CESS at 94% of the optimal, reducing the energy costs for the end user by up to 12%.
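
As a rough illustration of the eligibility-trace mechanism the abstract refers to, the sketch below implements tabular SARSA(λ) for a simplified storage-dispatch problem. This is not the paper's implementation: the toy price profile, state-of-charge discretisation, action set, and reward are illustrative assumptions; only the trace-based credit assignment (decay all traces by γλ, bump the visited state-action pair, and spread the TD error over all traced pairs) follows the standard TD(λ) scheme.

import numpy as np

rng = np.random.default_rng(0)

T = 24                                   # hourly decision steps in one day
# Assumed toy day-ahead price profile with an evening peak (not from the paper)
prices = 0.10 + 0.15 * np.exp(-0.5 * ((np.arange(T) - 18) / 2.0) ** 2)
soc_levels = 11                          # discretised state of charge: 0.0, 0.1, ..., 1.0
actions = np.array([-0.2, 0.0, 0.2])     # discharge / idle / charge (fraction of capacity per hour)

alpha, gamma, lam, eps = 0.1, 0.99, 0.9, 0.1
Q = np.zeros((T, soc_levels, len(actions)))

def step(t, soc, a_idx):
    """Toy transition: pay the hourly price when charging, earn it when discharging."""
    new_soc = float(np.clip(soc + actions[a_idx], 0.0, 1.0))
    energy = new_soc - soc               # energy actually exchanged after clipping
    reward = -prices[t] * energy
    return new_soc, reward

def soc_to_idx(soc):
    return int(round(soc * (soc_levels - 1)))

def greedy_or_explore(t, s):
    if rng.random() < eps:
        return int(rng.integers(len(actions)))
    return int(np.argmax(Q[t, s]))

for episode in range(2000):
    E = np.zeros_like(Q)                 # eligibility traces, reset every episode
    soc = 0.5
    s = soc_to_idx(soc)
    a = greedy_or_explore(0, s)
    for t in range(T):
        soc_next, r = step(t, soc, a)
        s_next = soc_to_idx(soc_next)
        if t + 1 < T:
            a_next = greedy_or_explore(t + 1, s_next)
            td_target = r + gamma * Q[t + 1, s_next, a_next]
        else:
            a_next = 0
            td_target = r                # terminal step of the daily episode
        td_error = td_target - Q[t, s, a]
        E[t, s, a] += 1.0                # accumulating trace for the visited pair
        Q += alpha * td_error * E        # credit every recently visited state-action pair
        E *= gamma * lam                 # decay traces
        soc, s, a = soc_next, s_next, a_next

In this sketch the trace matrix E is what lets a price spike at hour 18 reinforce the charging actions taken hours earlier, which is the credit-assignment effect the abstract attributes to eligibility traces.
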
Original language: English
Article number: 108515
Journal: Electric Power Systems Research
Volume: 212
Early online date: 16 Jul 2022
DOIs
Publication status: Published - 1 Nov 2022
