Federated Learning (FL) has become a transformative approach in machine learning, enabling decentralized model training across multiple devices while preserving data privacy. The paradigm addresses critical concerns around privacy, security, and communication overhead, making it particularly relevant in domains such as healthcare, finance, and the Internet of Things (IoT). Resource-constrained FL extends the concept to environments where computational, communication, and energy resources are limited, such as edge networks and IoT devices, and focuses on optimizing the learning process so that effective model training remains possible under those limits. This survey provides a comprehensive, structured overview of the current state of research in FL and resource-constrained FL. By examining 62 key publications, it synthesizes insights and developments across both areas, highlighting advancements, ongoing challenges, and remaining gaps. It also identifies open research problems and proposes future directions, such as improving communication efficiency, developing adaptive learning algorithms, and enhancing resource-management strategies. The survey is intended as a practical resource for researchers, practitioners, and stakeholders, guiding future exploration and innovation in FL and its applications in resource-constrained environments.
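To make the decentralized-training idea concrete, the sketch below shows one round of Federated Averaging (FedAvg), the canonical FL aggregation scheme: each client trains on its own data locally, and the server averages the returned weights in proportion to each client's dataset size. The toy one-parameter linear model, the function names, and the hyperparameters are illustrative assumptions, not drawn from the survey itself.

```python
def local_update(w, data, lr=0.1, epochs=5):
    """One client's local SGD on a toy 1-D linear model y = w * x (assumed example)."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of squared error (w*x - y)^2
            w -= lr * grad
    return w

def fedavg_round(global_w, client_datasets):
    """Server-side FedAvg step: average client weights, weighted by sample count."""
    total = sum(len(d) for d in client_datasets)
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    return sum(w * n for w, n in updates) / total

# Two clients whose local data both follow y = 2x; repeated FedAvg rounds
# should drive the global weight toward 2 without pooling the raw data.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(10):
    w = fedavg_round(w, clients)
```

Note that the raw `(x, y)` pairs never leave the clients; only model weights are exchanged, which is the privacy-preserving property the abstract refers to. Real deployments add compression, client sampling, and secure aggregation on top of this basic loop.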
Keywords: Decentralized Training, Edge Computing, Privacy-Preserving Algorithms, IoT Applications, Computational Efficiency
| Primary Language | English |
|---|---|
| Subjects | Edge Computing, Deep Learning, Machine Learning (Other) |
| Journal Section | Reviews |
| Authors | |
| Publication Date | December 20, 2024 |
| Submission Date | August 19, 2024 |
| Acceptance Date | November 7, 2024 |
| Published in Issue | Year 2024, Volume: 1, Issue: 2 |