Research Article

A new classification-based approach for multi-focus image fusion

Year 2024, Volume: 42, Issue: 1, 11-25, 27.02.2024

Abstract

Multi-focus image fusion combines two or more images of the same scene with different focus points to create a single, fully focused, detailed image. The primary purpose of multi-focus image fusion methods is to transfer the correct focus information from the source images to the fused image. This study proposes a new classification mechanism based on focus metrics. This mechanism is designed to classify focused, non-focused, and ambiguous regions. The most important feature of the proposed mechanism is that it can detect ambiguous regions and transfer them to the fused image correctly. First, each source image is split into non-overlapping image patches of a specified size. Then, the generated image patches are classified using the proposed classification mechanism. After the classification process, a decision map is created for each source image. These decision maps are then refined using morphological operations. In the final stage, a dynamic fusion rule is proposed. This fusion rule transfers focused and non-focused pixels to the fused image according to a specific rule, whereas ambiguous regions, frequently encountered in transitions from focused to non-focused areas, are transferred to the fused image using a gradient-based fusion rule. In this way, the negative effect of incorrectly classified regions on the fused image is reduced. In addition, the impact of patch size on fusion success is analyzed by using different patch sizes in the classification mechanism. Finally, the proposed method is evaluated using objective and subjective metrics. The evaluations show that the proposed method is well suited to the purposes of multi-focus image fusion.
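The pipeline the abstract describes (patch splitting, focus-metric classification into focused, non-focused, and ambiguous regions, morphological refinement of the decision maps, and a gradient-based rule for ambiguous regions) can be illustrated with a short sketch. The code below is not the authors' implementation: the focus metric (variance of the Laplacian), the patch size, the ratio thresholds, and all function names are illustrative assumptions.

```python
# Minimal sketch of a patch-based classify-then-fuse pipeline, assuming
# two registered grayscale sources whose dimensions are multiples of PATCH.
# Metric, patch size, and thresholds are illustrative, not the paper's values.
import numpy as np
from scipy import ndimage

PATCH = 16                 # assumed patch size (pixels)
T_LOW, T_HIGH = 0.8, 1.25  # assumed ratio band treated as "ambiguous"

def focus_measure(patch):
    """Variance of the Laplacian: a common sharpness (focus) metric."""
    return ndimage.laplace(patch.astype(np.float64)).var()

def classify_patches(img_a, img_b):
    """Label each patch: 1 = focused in img_a, 0 = focused in img_b,
    -1 = ambiguous (focus measures too close to call)."""
    h, w = img_a.shape
    labels = np.zeros((h // PATCH, w // PATCH), dtype=np.int8)
    for i in range(h // PATCH):
        for j in range(w // PATCH):
            ys = slice(i * PATCH, (i + 1) * PATCH)
            xs = slice(j * PATCH, (j + 1) * PATCH)
            fa = focus_measure(img_a[ys, xs])
            fb = focus_measure(img_b[ys, xs])
            ratio = (fa + 1e-9) / (fb + 1e-9)
            if ratio > T_HIGH:
                labels[i, j] = 1      # clearly sharper in img_a
            elif ratio < T_LOW:
                labels[i, j] = 0      # clearly sharper in img_b
            else:
                labels[i, j] = -1     # ambiguous (e.g., focus transition)
    return labels

def fuse(img_a, img_b):
    """Expand patch labels to a pixel decision map, refine it
    morphologically, then fuse: copy unambiguous pixels directly and
    resolve ambiguous ones by local gradient magnitude."""
    assert img_a.shape == img_b.shape
    assert all(s % PATCH == 0 for s in img_a.shape)
    labels = classify_patches(img_a, img_b)
    dmap = np.kron(labels, np.ones((PATCH, PATCH), dtype=np.int8))
    # Morphological refinement: drop isolated misclassified patches.
    focused_a = ndimage.binary_opening(dmap == 1)
    fused = np.where(focused_a, img_a, img_b).astype(np.float64)
    # Gradient-based rule for ambiguous regions: per pixel, keep the
    # source with the larger gradient magnitude.
    ga = np.hypot(*np.gradient(img_a.astype(np.float64)))
    gb = np.hypot(*np.gradient(img_b.astype(np.float64)))
    ambiguous = dmap == -1
    fused[ambiguous] = np.where(ga >= gb, img_a, img_b)[ambiguous]
    return fused.astype(img_a.dtype)
```

On registered grayscale sources, fuse(img_a, img_b) returns the fused image. The ratio test flags a patch as ambiguous when the two focus measures are close, which is exactly where the per-pixel gradient rule takes over, mirroring the dynamic fusion rule described in the abstract.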


Details

Primary Language English
Subjects Biochemistry and Cell Biology (Other)
Journal Section Research Articles
Authors

Samet Aymaz 0000-0003-0735-0487

Şeyma Aymaz 0000-0002-8978-4459

Cemal Köse 0000-0002-5982-4771

Publication Date February 27, 2024
Submission Date February 13, 2022
Published in Issue Year 2024 Volume: 42 Issue: 1

Cite

Vancouver Aymaz S, Aymaz Ş, Köse C. A new classification-based approach for multi-focus image fusion. SIGMA. 2024;42(1):11-25.
