Research Article

Comparative Analysis of Open-Source Deep Learning Models in Terms of Energy Consumption, Computational Load, and Performance

Volume: 9 Number: 2 March 15, 2026

Abstract

The energy consumption of deep learning models during training and inference has become an important performance indicator, especially for applications running on resource-constrained devices. Although computational costs differ significantly across architectures, studies that comprehensively compare the energy efficiency of models remain limited. In this study, six widely used models (MobileNetV2, EfficientNet-B0, ResNet50, DenseNet121, Xception, and VGG19) were trained and evaluated on the same dataset under identical experimental settings. Real-time power measurements were performed on an RTX 2070 GPU to calculate each model's total energy consumption during training and inference, average power draw, frames per second (FPS), and energy cost per image (J/image). The findings show that lightweight architectures are significantly more efficient: MobileNetV2 achieved the lowest inference energy consumption at 0.2289 J/image, while EfficientNet-B0 offered a balanced trade-off between accuracy and energy usage. In contrast, VGG19 stood out as the least efficient model due to its high power requirements. The results reveal that model architecture has a direct impact on energy consumption and that model selection plays a critical role in the design of sustainable artificial intelligence systems.
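The paper's measurement code is not reproduced here, but the J/image metric described in the abstract can be sketched as follows: sample instantaneous GPU power (e.g., via NVML, whose `nvmlDeviceGetPowerUsage` call reports milliwatts), integrate the samples over the inference window, and divide by the number of images processed. The function and variable names below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the J/image computation. Power samples would come
# from a GPU monitoring API such as NVML; here they are plain floats so the
# arithmetic is self-contained.

def energy_per_image(power_samples_w, timestamps_s, n_images):
    """Integrate power over time (trapezoidal rule) and divide by image count.

    power_samples_w : power readings in watts, taken during inference
    timestamps_s    : sample times in seconds, same length as the readings
    n_images        : number of images processed in the sampling window
    """
    if len(power_samples_w) != len(timestamps_s) or len(power_samples_w) < 2:
        raise ValueError("need >= 2 aligned (power, time) samples")
    energy_j = 0.0
    for i in range(1, len(timestamps_s)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        # trapezoidal step: average power over the interval times its duration
        energy_j += 0.5 * (power_samples_w[i] + power_samples_w[i - 1]) * dt
    return energy_j / n_images

# Example: a constant 100 W draw for 2 s while processing 1000 images
# yields 200 J total, i.e. 0.2 J/image.
print(energy_per_image([100.0, 100.0, 100.0], [0.0, 1.0, 2.0], 1000))  # -> 0.2
```

Higher sampling rates reduce the integration error when power fluctuates during inference, which is why polling-based tools typically sample at tens of milliseconds.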

Keywords

Ethical Statement

Ethics committee approval was not required for this study because it did not involve human participants or animal subjects.

Acknowledgments

The author would like to thank the open-source contributors of the Imagenette dataset and the developers of TensorFlow and the associated deep learning libraries used in this study. No additional administrative, technical, or material support was received.


Details

Primary Language

English

Subjects

Information Systems For Sustainable Development and The Public Good

Journal Section

Research Article

Publication Date

March 15, 2026

Submission Date

January 7, 2026

Acceptance Date

February 5, 2026

Published in Issue

Year 2026 Volume: 9 Number: 2

APA
Sancar, Y. (2026). Comparative Analysis of Open-Source Deep Learning Models in Terms of Energy Consumption, Computational Load, and Performance. Black Sea Journal of Engineering and Science, 9(2), 616-623. https://doi.org/10.34248/bsengineering.1858749
