Classification and counting of blood cells is crucial for diagnosing and treating diseases in the clinic. The peripheral blood smear is a fast, reliable, and robust diagnostic tool for examining blood samples. However, cell overlap introduced during the smearing process can lead to incorrect results when counting blood cells and classifying cell types, a problem that affects both automated systems and manual inspection by experts. Convolutional neural networks (CNNs) provide reliable results for segmentation and classification in many medical imaging problems; however, creating ground-truth labels for segmentation is time-consuming and error-prone. This study proposes a new CNN-based strategy to eliminate the overlap-induced counting problem in peripheral blood smear samples and to accurately determine the blood cell type. In the proposed method, peripheral blood images are divided into sub-images, block by block, using adaptive image processing techniques to identify overlapping cells and cell types. A CNN is then used to classify the cell type and the number of overlapping cells in each sub-image. The proposed method successfully counts overlapping erythrocytes and determines the cell type with an accuracy of 99.73%. The results show that the proposed method can be used efficiently in various fields.
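The block-wise sub-image step described above can be sketched as follows. This is an illustrative sketch only, assuming non-overlapping fixed-size blocks with zero-padding at the borders; the block size, padding scheme, and function name are assumptions for illustration, not the authors' exact adaptive procedure.

```python
import numpy as np

def split_into_blocks(image, block_size):
    """Split a 2-D (H x W) or 3-D (H x W x C) image into non-overlapping
    block_size x block_size sub-images, zero-padding the borders so every
    block is full-sized.

    Illustrative sketch of the block-wise sub-image step; the adaptive
    block selection in the paper is not reproduced here.
    """
    h, w = image.shape[:2]
    pad_h = (-h) % block_size  # rows needed to reach a multiple of block_size
    pad_w = (-w) % block_size  # columns needed likewise
    pad_spec = [(0, pad_h), (0, pad_w)] + [(0, 0)] * (image.ndim - 2)
    padded = np.pad(image, pad_spec, mode="constant")
    blocks = []
    for y in range(0, padded.shape[0], block_size):
        for x in range(0, padded.shape[1], block_size):
            blocks.append(padded[y:y + block_size, x:x + block_size])
    return blocks

# Example: a 100 x 120 grayscale image split into 32 x 32 blocks yields
# ceil(100/32) * ceil(120/32) = 4 * 4 = 16 sub-images.
demo = np.zeros((100, 120), dtype=np.uint8)
sub_images = split_into_blocks(demo, 32)
```

Each resulting sub-image would then be passed to the CNN, which predicts the cell type and the number of overlapping cells it contains.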
|Subjects||Computer Science, Interdisciplinary Application, Optics|
|Journal Section||Research Articles|
|Supporting Institution||Sakarya University of Applied Science Scientific Research Projects Coordination Unit|
|Thanks||This work was supported by Sakarya University of Applied Science Scientific Research Projects Coordination Unit (SUBU BAPK, Project Number: 2020-01-01-011). The author, Muhammed Ali PALA, is grateful to The Scientific and Technological Research Council of Turkey (TUBITAK, 2211C) for granting a scholarship for his Ph.D. studies.|
|Early Pub Date||July 30, 2022|
|Publication Date||July 30, 2022|
|Published in Issue||Year 2022, Volume 4, Issue 2|