In this paper, we propose a multimodal biometric face recognition technique based on the 2D Discrete Wavelet Transform (DWT) and Data Fusion (DF), with the fusion performed at the score level of the recognition pipeline. The technique employs three distinct unimodal feature extraction and classification methods. The first two feature vectors are generated from the raw images using Principal Component Analysis (PCA) and Local Binary Pattern (LBP) methods.
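A minimal sketch of these two extractors is given below; it assumes grayscale face images in a NumPy array and relies on scikit-learn's `PCA` and scikit-image's `local_binary_pattern`. The function names and parameter values (number of components, LBP neighbourhood) are illustrative choices, not specifics taken from the paper.

```python
# Sketch of the first two unimodal feature extractors, assuming grayscale
# face images stored as a (n_samples, height, width) NumPy array.
# Function names and parameter values are illustrative, not from the paper.
import numpy as np
from sklearn.decomposition import PCA
from skimage.feature import local_binary_pattern

def pca_features(images, n_components=50):
    """Project flattened face images onto their principal components (eigenfaces)."""
    flat = images.reshape(len(images), -1)            # one row per face image
    pca = PCA(n_components=n_components)
    return pca.fit_transform(flat), pca               # keep the model to project probe images later

def lbp_features(images, points=8, radius=1):
    """Describe each face by the histogram of its uniform LBP codes."""
    n_bins = points + 2                               # 'uniform' LBP yields P + 2 distinct codes
    feats = []
    for img in images:
        codes = local_binary_pattern(img, points, radius, method="uniform")
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        feats.append(hist)
    return np.asarray(feats)
```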
For the third feature vector, the images are first transformed into the DWT domain, and the resulting approximation, horizontal, vertical, and diagonal detail matrices are combined to form a Joint Feature Vector (JFV). A K-Nearest Neighbor (KNN) classifier is then applied separately to each of the three feature vectors to compute a set of score values for the same individual.
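The sketch below shows one way this stage could look. It assumes a single-level 2D DWT with the Haar wavelet (via PyWavelets) and, in place of the paper's KNN classifier output, a simple per-class nearest-neighbour distance as the raw matching score; the wavelet family, decomposition level, and scoring details are assumptions, not specifics from the paper.

```python
# Sketch of the DWT-based Joint Feature Vector and of per-class
# nearest-neighbour matching scores. The single decomposition level, the Haar
# wavelet, and the distance-based scoring are assumptions for illustration.
import numpy as np
import pywt

def joint_feature_vector(image, wavelet="haar"):
    """Single-level 2D DWT; concatenate approximation and detail coefficients."""
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
    return np.concatenate([cA.ravel(), cH.ravel(), cV.ravel(), cD.ravel()])

def nn_scores(gallery_feats, gallery_labels, probe_feat):
    """Raw matching score per enrolled class: distance to the closest gallery sample."""
    scores = {}
    for label in np.unique(gallery_labels):
        class_feats = gallery_feats[gallery_labels == label]
        dists = np.linalg.norm(class_feats - probe_feat, axis=1)
        scores[label] = dists.min()                   # smaller distance = better match
    return scores
```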
These raw score values are then fused using a newly proposed data fusion technique based on Manhattan Distance Penalty Weighting (MDPW). MDPW penalizes an individual for scoring poorly and pushes it further away from the potentially winning class before the fusion is carried out.
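Since the abstract does not spell out the MDPW formula, the sketch below is only an illustrative interpretation: each modality's distance scores are min-max normalised and then amplified in proportion to their Manhattan (L1) gap from that modality's best score, so that weak candidates drift further from the potential winner before the scores are combined. The function name, the penalty form, and the weighting parameter `alpha` are all assumptions.

```python
# A heavily hedged sketch of score-level fusion with a Manhattan-distance
# penalty weighting. The exact MDPW formula is defined in the paper; the
# penalty below is only one plausible reading of "push low-scoring candidates
# further away from the potentially winning class" before fusion.
import numpy as np

def mdpw_style_fusion(score_vectors, alpha=1.0):
    """Fuse per-modality distance scores (smaller = better) into one fused score per class.

    score_vectors : list of 1-D arrays, one per modality, aligned by class index.
    alpha         : strength of the hypothetical penalty term.
    """
    fused = np.zeros_like(score_vectors[0], dtype=float)
    for s in score_vectors:
        s = (s - s.min()) / (s.max() - s.min() + 1e-12)   # min-max normalisation per modality
        gap = np.abs(s - s.min())                         # L1 gap to that modality's best score
        fused += s * (1.0 + alpha * gap)                  # poor candidates are pushed further away
    return fused                                          # smallest fused score wins

# Toy usage: three modalities (PCA, LBP, DWT-JFV), four enrolled classes.
pca_s = np.array([0.20, 0.90, 0.50, 0.70])
lbp_s = np.array([0.30, 0.80, 0.40, 0.90])
jfv_s = np.array([0.10, 0.70, 0.60, 0.80])
print("recognised class index:", int(np.argmin(mdpw_style_fusion([pca_s, lbp_s, jfv_s]))))
```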
The proposed approach was implemented on the ORL and YALE public face databases and evaluated using the recognition rates and receiver operating characteristics of the biometric classification systems. Experimental results show that the proposed multimodal system outperforms the individual unimodal systems as well as multimodal systems that use other data fusion rules (e.g., the Sum Rule or the Product Rule). On the ORL database, a recognition rate of up to 97% is achieved with the proposed technique.
Keywords: face recognition, discrete wavelet transform, principal component analysis, local binary pattern, data fusion, Manhattan distance, k-nearest neighbor (KNN)
| Subjects | Engineering |
| --- | --- |
| Journal Section | Articles |
| Authors | |
| Publication Date | July 27, 2017 |
| Published in Issue | Year 2017, Volume 7, Issue 3 |