Energy Aware Deep Reinforcement Learning Scheduling for Sensors Correlated in Time and Space

Jernej Hribar, Andrei Marinescu, Alessandro Chiumento, Luiz A DaSilva

Research output: Working paper


Abstract

Millions of battery-powered sensors deployed for monitoring purposes in a multitude of scenarios, e.g., agriculture, smart cities, and industry, require energy-efficient solutions to prolong their lifetime. When these sensors observe a phenomenon distributed in space and evolving in time, it is expected that the collected observations will be correlated in time and space. In this paper, we propose a Deep Reinforcement Learning (DRL) based scheduling mechanism capable of taking advantage of correlated information. We design our solution using the Deep Deterministic Policy Gradient (DDPG) algorithm. The proposed mechanism is capable of determining the frequency with which sensors should transmit their updates, to ensure accurate collection of observations, while simultaneously considering the energy available. To evaluate our scheduling mechanism, we use multiple datasets containing environmental observations obtained in several real deployments. The real observations enable us to model the environment with which the mechanism interacts as realistically as possible. We show that our solution can significantly extend the sensors' lifetime. We compare our mechanism to an idealized, all-knowing scheduler to demonstrate that its performance is near-optimal. Additionally, we highlight the unique feature of our design, energy-awareness, by displaying the impact of sensors' energy levels on the frequency of updates.
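As a rough illustration of the approach described in the abstract, the sketch below shows actor and critic networks of the kind a DDPG-based scheduler might use to map a sensor's state to a continuous update interval. This is not the authors' code: the state features (normalised estimation error, battery level, time since last update), the interval range, and the network sizes are illustrative assumptions, and the training loop, reward design, and replay buffer of full DDPG are omitted.

```python
# Minimal sketch (assumed PyTorch implementation, not the paper's code) of DDPG-style
# actor/critic networks for scheduling sensor updates. State layout and interval range
# are hypothetical choices for illustration only.
import torch
import torch.nn as nn


class Actor(nn.Module):
    """Maps a sensor state to an update interval in [min_interval, max_interval] seconds."""

    def __init__(self, state_dim=3, hidden=64, min_interval=1.0, max_interval=600.0):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),  # squash output to (0, 1)
        )
        self.min_interval = min_interval
        self.max_interval = max_interval

    def forward(self, state):
        # Scale the squashed output to the allowed interval range (continuous action).
        frac = self.net(state)
        return self.min_interval + frac * (self.max_interval - self.min_interval)


class Critic(nn.Module):
    """Estimates Q(state, action) for a chosen update interval."""

    def __init__(self, state_dim=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, state, action):
        return self.net(torch.cat([state, action], dim=-1))


# Example use: state = [normalised estimation error, battery level, time since last update]
actor = Actor()
state = torch.tensor([[0.2, 0.8, 0.5]])
interval = actor(state)  # suggested seconds until the sensor's next transmission
print(interval.item())
```

In a full DDPG agent these networks would be paired with target copies, an experience replay buffer, and a reward that trades off estimation accuracy against the energy spent on transmissions, which is how energy-awareness would enter the learned policy.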
Original language: English
Publisher: ArXiv.org
Publication status: Published - 19 Nov 2020

