The Internet of Things (IoT) has made daily life more convenient across many domains. For an IoT environment to function effectively, however, data must be collected from IoT devices regularly: timely collection enables more accurate evaluations and insights. Energy conservation is another crucial consideration during data collection, as it strongly affects the sustainability of the IoT ecosystem. To this end, Unmanned Aerial Vehicles (UAVs) and Unmanned Ground Vehicles (UGVs) are increasingly used as mobile data collectors. In this study, we examine how UAVs and UGVs can collect data from IoT devices effectively and efficiently in an environment with obstacles. To address this challenge, we propose a Q-learning-based Obstacle Avoidance Data Harvesting (QOA-DH) method, which applies reinforcement learning to data-collection decisions. We also compare the performance of UAVs and UGVs under the restrictions and assumptions specific to each vehicle type. This research aims to improve the efficiency and effectiveness of data collection in IoT environments and to pave the way for sustainable IoT solutions.
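To give a flavor of the approach, the sketch below shows tabular Q-learning on a small grid where a vehicle must reach an IoT device while avoiding obstacles. This is only a minimal illustration of the general technique named in the abstract; the grid layout, reward values, and hyperparameters are assumptions for demonstration, not the paper's actual QOA-DH formulation.

```python
import random

# Illustrative grid world: a vehicle starts at (0, 0) and must reach
# the IoT device while avoiding obstacle cells. (Hypothetical setup.)
GRID = 5
OBSTACLES = {(1, 1), (2, 3), (3, 1)}
DEVICE = (4, 4)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]   # right, left, down, up
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1               # assumed hyperparameters

Q = {}  # (state, action) -> estimated value, default 0.0

def step(state, action):
    """Apply an action; return (next_state, reward, done)."""
    nxt = (state[0] + action[0], state[1] + action[1])
    if not (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID) or nxt in OBSTACLES:
        return state, -5.0, False      # blocked move: penalty, stay put
    if nxt == DEVICE:
        return nxt, 10.0, True         # data collected: reward, episode ends
    return nxt, -1.0, False            # step cost encourages short paths

def train(episodes=2000):
    for _ in range(episodes):
        state, done = (0, 0), False
        while not done:
            if random.random() < EPS:  # epsilon-greedy exploration
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
            nxt, reward, done = step(state, action)
            best_next = max(Q.get((nxt, a), 0.0) for a in ACTIONS)
            q = Q.get((state, action), 0.0)
            # Standard Q-learning temporal-difference update
            Q[(state, action)] = q + ALPHA * (reward + GAMMA * best_next - q)
            state = nxt

def greedy_path(start=(0, 0), limit=50):
    """Follow the learned policy greedily from start."""
    state, path = start, [start]
    for _ in range(limit):
        action = max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
        state, _, done = step(state, action)
        path.append(state)
        if done:
            break
    return path

random.seed(0)
train()
path = greedy_path()
```

After training, `greedy_path()` traces an obstacle-free route from the start cell to the device; in the paper's setting, additional constraints (e.g. energy budgets and vehicle-specific movement restrictions for UAVs vs. UGVs) would shape the reward and action model.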
| Primary Language | English |
|---|---|
| Subjects | Computer Software |
| Journal Section | Research Article |
| Authors | |
| Early Pub Date | July 6, 2023 |
| Publication Date | June 30, 2023 |
| Published in Issue | Year 2023 Volume: 13 Issue: 1 |
All articles published by EJT are licensed under the Creative Commons Attribution 4.0 International License. This permits anyone to copy, redistribute, remix, transmit, and adapt the work, provided the original work and source are appropriately cited.