This study introduces a novel energy management model based on deep reinforcement learning for IoT-based landslide early warning systems, aiming to achieve energy neutrality and enhance system resilience, efficiency, and sustainability. Unlike traditional energy optimization methods, the proposed model employs a Deep Q-Network (DQN) to dynamically optimize the duty cycle of sensor nodes based on real-time energy availability. By adaptively balancing energy harvesting and consumption, sensor nodes can maintain continuous operation even under highly variable environmental conditions, maximizing performance during high-energy periods while preserving battery life when energy is limited. Extensive simulations using real-world solar radiation data demonstrate the model's superior capability in extending system longevity and operational stability compared to existing approaches. By addressing critical energy management challenges in landslide monitoring systems, this work enhances system reliability, scalability, and adaptability, offering a robust foundation for broader IoT applications deployed in energy-limited and dynamic environments. The proposed method represents a significant improvement over conventional techniques, as it autonomously optimizes energy resources to ensure the continuous and sustainable operation of IoT ecosystems.
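To illustrate the duty-cycle adaptation idea described above, the sketch below trains a tabular Q-learning agent to pick a duty cycle from a discretized battery level. Note the simplifications: the paper uses a Deep Q-Network rather than a table, and the state discretization, action set, energy dynamics, and reward shape here are illustrative assumptions, not the authors' model.

```python
import random

# Illustrative simplification of the paper's approach: tabular Q-learning
# over discretized battery states and duty-cycle actions. The paper itself
# uses a DQN (neural-network Q-function); all quantities below are
# assumptions made for this sketch.

BATTERY_LEVELS = 5              # discretized state: 0 (empty) .. 4 (full)
DUTY_CYCLES = [0.1, 0.5, 1.0]   # actions: fraction of time the node is active

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = [[0.0] * len(DUTY_CYCLES) for _ in range(BATTERY_LEVELS)]

def step(battery, action_idx, harvest):
    """Toy environment: higher duty cycles drain more energy; harvesting refills."""
    duty = DUTY_CYCLES[action_idx]
    new_battery = max(0, min(BATTERY_LEVELS - 1,
                             battery - round(duty * 2) + harvest))
    # Reward sensing activity, but penalize draining the battery to empty.
    reward = duty if new_battery > 0 else -10.0
    return new_battery, reward

def choose_action(battery):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPSILON:
        return random.randrange(len(DUTY_CYCLES))
    return max(range(len(DUTY_CYCLES)), key=lambda a: q_table[battery][a])

random.seed(0)
battery = BATTERY_LEVELS - 1
for _ in range(5000):
    harvest = random.choice([0, 1, 2])   # stand-in for variable solar harvesting
    action = choose_action(battery)
    new_battery, reward = step(battery, action, harvest)
    # Standard Q-learning update toward the bootstrapped target.
    best_next = max(q_table[new_battery])
    q_table[battery][action] += ALPHA * (
        reward + GAMMA * best_next - q_table[battery][action])
    battery = new_battery

# Learned policy: preferred duty cycle for each battery level.
policy = [DUTY_CYCLES[max(range(len(DUTY_CYCLES)),
                          key=lambda a: q_table[b][a])]
          for b in range(BATTERY_LEVELS)]
print(policy)
```

After training, the greedy policy tends to select conservative duty cycles at low battery levels and more aggressive ones when energy is plentiful, which is the qualitative behavior the abstract describes.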
Keywords: Reinforcement learning, Internet of Things, energy management, optimization, landslide early warning systems
Primary Language | English |
---|---|
Subjects | Deep Learning, Neural Networks, Reinforcement Learning, Modelling and Simulation |
Journal Section | Research Articles |
Authors | |
Publication Date | December 31, 2024 |
Submission Date | November 20, 2024 |
Acceptance Date | December 27, 2024 |
Published in Issue | Year 2024 Issue: 059 |