Article

Energy-Aware Hierarchical Reinforcement Learning Based on the Predictive Energy Consumption Algorithm for Search and Rescue Aerial Robots in Unknown Environments

College of Interdisciplinary Science and Technologies, University of Tehran, Tehran 14174-66191, Iran
*
Author to whom correspondence should be addressed.
Drones 2024, 8(7), 283; https://doi.org/10.3390/drones8070283
Submission received: 26 May 2024 / Revised: 12 June 2024 / Accepted: 13 June 2024 / Published: 23 June 2024

Abstract

Aerial robots (drones) offer critical advantages in missions where human participation is impeded by hazardous conditions. Among these, search and rescue missions in disaster-stricken areas are particularly challenging because of the dynamic and unpredictable nature of the environment, often compounded by the lack of reliable environmental models and limited communication with ground systems. In such scenarios, autonomous operation of aerial robots becomes essential. This paper introduces a novel hierarchical reinforcement learning (HRL)-based algorithm to address a critical limitation of aerial robots: their battery life. Central to our approach is the integration of a long short-term memory (LSTM) model designed for precise battery consumption prediction. This model is incorporated into our HRL framework, empowering a high-level controller to set feasible and energy-efficient goals for a low-level controller. By optimizing battery usage, our algorithm enhances the aerial robot's ability to deliver rescue packs to multiple survivors without the frequent need for recharging. Furthermore, we augment our HRL approach with hindsight experience replay at the low level to improve its sample efficiency.
Keywords: hierarchical reinforcement learning; long short-term memory networks; search and rescue mission; energy-efficient path planning
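
To make the architecture described in the abstract concrete, the sketch below illustrates the general idea of an LSTM energy-consumption predictor being used by a high-level controller to propose only goals whose predicted cost fits the remaining battery. This is a minimal, hypothetical example written in PyTorch, not the authors' implementation: the class and function names (EnergyPredictorLSTM, select_feasible_goal), the state dimensions, the encode function, and the safety margin are all assumptions introduced here for illustration only.

```python
# Minimal sketch (assumed, not the paper's code): an LSTM predicts the energy a
# candidate goal would cost, and a high-level controller filters goals by the
# predicted cost against the remaining battery.
import torch
import torch.nn as nn


class EnergyPredictorLSTM(nn.Module):
    """Hypothetical predictor: maps a short history of flight states
    (e.g., position, velocity, battery level) to a scalar energy cost."""

    def __init__(self, state_dim: int = 8, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # scalar predicted energy cost

    def forward(self, state_history: torch.Tensor) -> torch.Tensor:
        # state_history: (batch, time, state_dim)
        _, (h_n, _) = self.lstm(state_history)
        return self.head(h_n[-1]).squeeze(-1)  # (batch,)


def select_feasible_goal(predictor, state_history, candidate_goals,
                         encode, remaining_energy, margin=0.1):
    """High-level step (assumed logic): return the first candidate goal whose
    predicted energy cost fits within the remaining battery, keeping a safety
    margin; return None if no goal is feasible (e.g., to trigger recharging).
    `encode` is a hypothetical helper mapping (history, goal) to model input."""
    with torch.no_grad():
        for goal in candidate_goals:
            predicted_cost = predictor(encode(state_history, goal)).item()
            if predicted_cost <= (1.0 - margin) * remaining_energy:
                return goal
    return None
```

In this reading, the low-level controller (trained with hindsight experience replay in the paper) would then be tasked with reaching the selected goal, while the predictor is retrained or fine-tuned on observed battery usage; the exact training loop and reward design are described in the full article.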

Share and Cite

MDPI and ACS Style

Ramezani, M.; Amiri Atashgah, M.A. Energy-Aware Hierarchical Reinforcement Learning Based on the Predictive Energy Consumption Algorithm for Search and Rescue Aerial Robots in Unknown Environments. Drones 2024, 8, 283. https://doi.org/10.3390/drones8070283

AMA Style

Ramezani M, Amiri Atashgah MA. Energy-Aware Hierarchical Reinforcement Learning Based on the Predictive Energy Consumption Algorithm for Search and Rescue Aerial Robots in Unknown Environments. Drones. 2024; 8(7):283. https://doi.org/10.3390/drones8070283

Chicago/Turabian Style

Ramezani, M., and M. A. Amiri Atashgah. 2024. "Energy-Aware Hierarchical Reinforcement Learning Based on the Predictive Energy Consumption Algorithm for Search and Rescue Aerial Robots in Unknown Environments" Drones 8, no. 7: 283. https://doi.org/10.3390/drones8070283
