Abstract
A reinforcement learning (RL) agent can capture energy system dynamics from historical data to improve its decision-making. However, such an agent struggles to fully capture temporal patterns in scenarios with high fluctuations. This study addresses this gap by embedding temporal awareness into RL agents for energy system control. The proposed approach integrates generation and load time series into the state representation through a temporal embedding module based on long short-term memory (LSTM). The method is evaluated on highly variable generation and load patterns from the Netherlands to demonstrate the performance of temporally aware RL decision-making under different conditions.
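The abstract does not specify the module's internals, but the general idea of an LSTM-based temporal embedding can be sketched as follows: a recurrent cell summarizes a window of recent generation and load values into a compact embedding, which is then concatenated with the instantaneous observation to form the agent's state. This is an illustrative toy, a single-unit scalar LSTM with placeholder weights; the names `ScalarLSTM` and `temporal_state` and all numeric values are assumptions, not the paper's implementation (in practice the LSTM would be multi-dimensional and trained jointly with the policy).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class ScalarLSTM:
    """Toy single-unit LSTM cell that embeds a scalar time series.

    wx, wh, b are dicts keyed by gate name ('i' input, 'f' forget,
    'o' output, 'g' candidate) holding illustrative fixed weights.
    """
    def __init__(self, wx, wh, b):
        self.wx, self.wh, self.b = wx, wh, b

    def embed(self, series):
        h, c = 0.0, 0.0  # hidden and cell state
        for x in series:
            i = sigmoid(self.wx['i'] * x + self.wh['i'] * h + self.b['i'])
            f = sigmoid(self.wx['f'] * x + self.wh['f'] * h + self.b['f'])
            o = sigmoid(self.wx['o'] * x + self.wh['o'] * h + self.b['o'])
            g = math.tanh(self.wx['g'] * x + self.wh['g'] * h + self.b['g'])
            c = f * c + i * g          # update cell memory
            h = o * math.tanh(c)       # emit hidden state
        return h  # final hidden state serves as the temporal embedding

def temporal_state(raw_obs, load_series, gen_series, lstm):
    """Augment the instantaneous observation with temporal embeddings
    of the recent load and generation histories (one scalar each here)."""
    return raw_obs + [lstm.embed(load_series), lstm.embed(gen_series)]

# Hypothetical usage with placeholder weights and toy normalized series.
lstm = ScalarLSTM(
    wx={'i': 0.5, 'f': 0.5, 'o': 0.5, 'g': 0.5},
    wh={'i': 0.1, 'f': 0.1, 'o': 0.1, 'g': 0.1},
    b={'i': 0.0, 'f': 1.0, 'o': 0.0, 'g': 0.0},
)
load = [0.4, 0.9, 0.2, 0.7]   # recent normalized load values
gen = [0.1, 0.8, 0.6, 0.3]    # recent normalized generation values
state = temporal_state([0.5, 0.5], load, gen, lstm)
```

The RL policy would then act on `state` rather than on the raw observation alone, which is how the time-series context reaches the agent's decision-making.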
| Original language | English |
|---|---|
| Title of host publication | E-ENERGY 2025 - Proceedings of the 2025 16th ACM International Conference on Future and Sustainable Energy Systems |
| Publisher | Association for Computing Machinery |
| Pages | 1002-1004 |
| Number of pages | 3 |
| ISBN (Electronic) | 9798400711251 |
| DOIs | |
| Publication status | Published - 16 Jun 2025 |
| Event | 16th ACM International Conference on Future and Sustainable Energy Systems, E-Energy 2025 — Rotterdam, Netherlands; Duration: 17 Jun 2025 → 20 Jun 2025; Conference number: 16 |
Conference
| Conference | 16th ACM International Conference on Future and Sustainable Energy Systems, E-Energy 2025 |
|---|---|
| Abbreviated title | E-Energy |
| Country/Territory | Netherlands |
| City | Rotterdam |
| Period | 17/06/25 → 20/06/25 |
Keywords
- Energy System Control
- Long Short-Term Memory
- Reinforcement Learning
- Sequential Decision-making
- Temporal Awareness