Ember: Energy management of batteryless event detection sensors with deep reinforcement learning
2020
Energy management can extend the lifetime of batteryless, energy-harvesting systems by judiciously utilizing the available energy. Duty cycling such systems is especially challenging for event detection, as events arrive sporadically and energy availability is uncertain. If the node sleeps too much, it may miss important events; if it depletes its energy too quickly, it will stop operating in low-energy conditions and miss events. Ideally, the sensor should turn on only just before an event happens. We propose Ember, an energy management system based on deep reinforcement learning that duty cycles event-driven sensors in low-energy conditions. We train a policy using historical real-world data traces of motion, temperature, humidity, pressure, and light events. The policy learns to capture up to 95% of the events without depleting the node. Because historical data for training a policy may be difficult to obtain when deploying a node at a new location, we also propose a self-supervised mechanism that collects ground-truth data while simultaneously learning from it. Without any historical data, Ember learns to capture the majority of events within a week, and within a few weeks it matches the performance of policies trained with historical data. We deployed 40 nodes running Ember for indoor sensing and demonstrate that the learned policies generalize to real-world settings and outperform state-of-the-art techniques.
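The abstract does not specify Ember's state, action, or reward design, so the following is only a minimal sketch of how such a duty-cycling decision problem could be framed as a reinforcement-learning environment. All names, energy costs, event rates, and reward values here are illustrative assumptions, not the paper's actual formulation.

```python
import random

# Hypothetical, simplified formulation of the duty-cycling problem described
# in the abstract. State variables, costs, rates, and rewards are illustrative
# assumptions, not the design used in the Ember paper.
class DutyCycleEnv:
    """One step = one time slot; the agent chooses SLEEP (0) or SENSE (1).

    Sensing costs stored energy, harvesting replenishes it, a reward is
    earned for sensing during a slot in which an event occurs, and a large
    penalty ends the episode if the energy buffer is depleted.
    """

    SLEEP, SENSE = 0, 1

    def __init__(self, capacity=100.0, sense_cost=2.0, event_rate=0.05):
        self.capacity = capacity      # energy buffer size (arbitrary units)
        self.sense_cost = sense_cost  # energy drawn by one sensing slot
        self.event_rate = event_rate  # per-slot probability of an event
        self.reset()

    def reset(self):
        self.energy = self.capacity / 2
        self.t = 0
        return self._obs()

    def _obs(self):
        # Observation: normalized stored energy and a time-of-day phase
        # (assuming 96 slots per day at 15-minute resolution).
        return (self.energy / self.capacity, (self.t % 96) / 96.0)

    def step(self, action):
        event = random.random() < self.event_rate  # sporadic event arrival

        reward = 0.0
        if action == self.SENSE:
            self.energy -= self.sense_cost
            if event and self.energy >= 0:
                reward = 1.0          # event captured
        elif event:
            reward = -0.1             # event missed while asleep

        depleted = self.energy < 0
        if depleted:
            reward = -5.0             # node browned out: end the episode

        # Stochastic harvesting (e.g., indoor light) refills the buffer.
        harvested = random.uniform(0.0, 1.0)
        self.energy = min(self.capacity, max(self.energy, 0.0) + harvested)
        self.t += 1
        return self._obs(), reward, depleted
```

Any standard deep reinforcement learning algorithm (e.g., DQN or PPO) trained against an environment of this shape would face the sleep/sense trade-off the abstract describes; Ember's actual state features, action space, and reward shaping are a matter for the full paper.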