This study proposes a weather-aware deep reinforcement learning (DRL) framework for predictive modelling of household energy dynamics. Using a 14-month high-resolution dataset from a residence in Northeast Mexico, the framework integrates detailed meteorological attributes and next-day forecasts to enhance prediction accuracy. Four DRL algorithms were implemented and evaluated for forecasting household energy consumption: Proximal Policy Optimisation (PPO), Soft Actor-Critic (SAC), Deep Deterministic Policy Gradient (DDPG), and Asynchronous Advantage Actor-Critic (A3C). Exploratory data analysis revealed significant seasonal trends and variability in energy usage patterns. Results show that DDPG and SAC outperform PPO and A3C, achieving the lowest root mean square error (RMSE) and mean absolute error (MAE), with DDPG reaching an RMSE of 0.0011 and an MAE of 0.0009. The framework was tested on moderately equipped hardware, demonstrating the practical feasibility of DRL-based energy forecasting systems. This work contributes original visualisations and comparative insights, advancing smart energy management solutions.
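The RMSE and MAE figures reported above are standard regression metrics. A minimal sketch of how they are computed is shown below; the toy consumption values are illustrative only and are not taken from the study's dataset.

```python
import math

def rmse(y_true, y_pred):
    # Root mean square error: square root of the mean squared residual.
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def mae(y_true, y_pred):
    # Mean absolute error: mean of the absolute residuals.
    n = len(y_true)
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n

# Hypothetical normalised consumption values (actual vs. predicted).
actual = [0.42, 0.55, 0.31]
predicted = [0.40, 0.57, 0.30]

print(round(rmse(actual, predicted), 4))  # → 0.0173
print(round(mae(actual, predicted), 4))   # → 0.0167
```

Because RMSE squares each residual before averaging, it penalises large errors more heavily than MAE, which is why the two metrics are typically reported together.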
Deep Reinforcement Learning, Household Energy Forecasting, Weather Data Integration, Predictive Modelling, Smart Energy Systems
| Primary Language | English |
|---|---|
| Subjects | Artificial Intelligence (Other) |
| Journal Section | Research Article |
| Authors | |
| Submission Date | July 13, 2025 |
| Acceptance Date | December 30, 2025 |
| Publication Date | January 30, 2026 |
| DOI | https://doi.org/10.26650/d3ai.1741550 |
| IZ | https://izlik.org/JA33MX94SA |
| Published in Issue | Year 2026 Volume: 2 Issue: 1 |