Keywords
palmprints, orientation field, region of interest, Euclidean distance.
I. INTRODUCTION:
Biometrics provides the identity of a user based on physiological or behavioural characteristics of the person. Every biometric technology has its own merits and limitations; thus, no single system can be considered the best for all applications [1]. One of the best-known biometric systems with very high accuracy is the iris-based system [2]. However, iris acquisition hardware is very expensive, has a high failure-to-enrol rate, and requires a high degree of cooperation from the user. Fingerprint-based systems are the most widely used in the world because of their simplicity, low cost and good accuracy, although small amounts of dirt or grease on the finger may degrade their performance. Hand-geometry-based systems suffer from high cost and low accuracy. Ear-based recognition has the problem of the ear being partially or fully occluded by hair or a cap [3]. Face-based recognition is low cost, requiring only a camera mounted in a suitable position such as the entrance of a physical access control area; however, face-based systems are less acceptable to users than fingerprint-based systems [4].
Palmprint is the region between the wrist and the fingers, and has features such as principal lines, wrinkles, datum points, delta points, ridges, minutiae points, singular points and texture patterns that can be used as biometric characteristics. Compared to other biometric systems, a palmprint-based identification system has many advantages: 1) features of the human hand are relatively stable and unique; 2) it needs very little cooperation from users for data acquisition; 3) data collection is non-intrusive; 4) low-cost devices are sufficient to acquire good-quality data; 5) the system uses low-resolution images but provides high accuracy; 6) compared to the fingerprint, a palmprint provides a larger surface area, so more features can be extracted; 7) because a lower-resolution imaging sensor is used to acquire the palmprint, computation is much faster at the pre-processing and feature extraction stages; 8) systems based on hand features are found to be the most acceptable; 9) the palmprint also serves as a reliable human identifier because the print patterns are not found to be duplicated even in monozygotic twins [5].
Palmprint-based systems make use of structural features, statistical features and combinations of the two. The structural features of a palmprint include principal lines, wrinkles, datum points, minutiae points, ridges and crease points. C. Han et al [6] used Sobel and morphological operations to extract line-like features from palmprints. N. Duta et al [7] used isolated points along the principal lines as features. A system based on the ridges of the palmprint, eliminating creases, has been proposed by J. Funada et al [8]. D. Zhang et al [9] used the end points of the principal lines, referred to as datum points. These datum points, used as features, are found to be location and direction invariant. J. Chen et al [10] proposed a palmprint-based system that uses crease points. X. Wu et al [11] considered directional line energy features, characterised with the help of crease points, for palmprint identification. Like a fingerprint, each palmprint also contains ridges and minutiae which can be used for matching palmprint images [12].
Statistical approaches to palmprint features include Principal Component Analysis [13], Linear Discriminant Analysis [14], Independent Component Analysis [15], Fourier transforms [16], Gabor filters [17], fusion code [18], competitive code [19], ordinal code [20] and wavelets [21]. Fusion of palmprint features with other traits, such as fingerprint [22], palm veins [23], hand geometry [24], face [25] and iris [26], to improve the accuracy of the system has been successfully attempted by various researchers.
II. PREPROCESSING OF THE PALMPRINTS:
In order to make the proposed algorithm rotation and translation invariant, it is necessary to extract the ROI from the captured palmprint image prior to feature extraction. The adopted procedure for extracting the ROI is similar to the procedure described for the standard PolyU database that is available online. The five major steps of palmprint image pre-processing to extract the ROI are as follows:
Step 1: Convolve the captured palmprint image with a low-pass filter. Convert the convolved image into a binary image using a threshold value Tp. This transformation can be represented as,
B(x, y) = 1, if O(x, y) * L(x, y) ≥ Tp; B(x, y) = 0, otherwise,
where B(x, y) and O(x, y) are the binary image and the original image, respectively; L(x, y) is a low-pass filter, such as a Gaussian; and "*" represents the convolution operator.
Step 2: Extract the boundaries of the holes between the fingers, (Fxij, Fyij) (i = 1, 2), using a boundary-tracking algorithm. The start points, (Sxi, Syi), and end points, (Exi, Eyi), of the holes are then marked in the process.
Step 3: Compute the center of gravity, (Cxi,Cyi), of each hole with the following equations: |
Cxi = (1 / M(i)) Σj Fxij,  Cyi = (1 / M(i)) Σj Fyij,  j = 1, 2, …, M(i),
where M(i) represents the number of boundary points of hole i. Then construct a line that passes through (Cxi, Cyi) and the midpoint of (Sxi, Syi) and (Exi, Eyi). The line equation is defined as,
(y − Cyi) (Mxi − Cxi) = (x − Cxi) (Myi − Cyi),
where, (Mxi, Myi) is the midpoint of (Sxi, Syi) and (Exi, Eyi). |
Based on these lines, two key points, (k1, k2), can easily be detected. |
Step 4: Line up k1 and k2 to get the Y-axis of the palmprint coordinate system and make a line through their midpoint which is perpendicular to the Y-axis, to determine the origin of the coordinate system. This coordinate system can align different palmprint images. |
Step 5: Extract a sub-image of fixed size, on the basis of the coordinate system, located at a certain part of the palmprint for feature extraction.
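As a rough illustration of Steps 1 and 3 above, the low-pass filtering with thresholding and the centre-of-gravity computation might be sketched as follows. This is a minimal sketch, not the authors' implementation: the function names, the Gaussian filter width and the threshold value are assumptions, and the boundary-tracking of Step 2 is taken as given.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def binarize_palm(image, threshold=40.0, sigma=2.0):
    """Step 1: low-pass (Gaussian) filtering followed by thresholding.
    `threshold` and `sigma` are illustrative values only."""
    smoothed = gaussian_filter(np.asarray(image, dtype=float), sigma=sigma)
    return (smoothed >= threshold).astype(np.uint8)

def hole_centroid(boundary_points):
    """Step 3: centre of gravity (Cxi, Cyi) of one finger-gap hole,
    given its (M, 2) array of boundary points from a tracking routine."""
    pts = np.asarray(boundary_points, dtype=float)
    return pts.mean(axis=0)
```

The centroid of each hole, together with the midpoint of its start and end points, then defines the line from which the key points k1 and k2 are located.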
III. FEATURE EXTRACTION OF THE PALMPRINTS USING ORIENTATION FIELD:
The orientation field of the palmprint image defines the local orientation of the ridges contained in the palmprint. The steps for computing the orientation at pixel (i, j) are as follows:
Step 1: Consider a block of size W × W centred at pixel (i, j) in the normalised palmprint image.
Step 2: For each pixel in the block, compute the gradients δx(i, j) and δy(i, j) in the x and y directions, respectively. The horizontal Sobel operator has been used to compute δx(i, j) and is defined as,
       | -1  0  1 |
Sx =   | -2  0  2 |
       | -1  0  1 |
The vertical Sobel operator has been used to compute δy (i, j) and is defined as, |
       | -1 -2 -1 |
Sy =   |  0  0  0 |
       |  1  2  1 |
Step 3: The local orientation at pixel (i, j) has been estimated using, |
Vx(i, j) = Σu Σv 2 δx(u, v) δy(u, v),
Vy(i, j) = Σu Σv ( δx(u, v)² − δy(u, v)² ),
θ(i, j) = (1/2) arctan( Vx(i, j) / Vy(i, j) ),
where θ(i, j) is the least-squares estimate of the local orientation of the block centred at pixel (i, j).
Step 4: Smooth the orientation field in a local neighbourhood using a Gaussian filter. The orientation image is first converted into a continuous vector field, which is defined as:
Φx(i, j) = cos( 2 θ(i, j) ),
Φy(i, j) = sin( 2 θ(i, j) ),
where Φx and Φy are the x and y components of the vector field, respectively. After the vector field has been computed, Gaussian smoothing is then performed as follows: |
Φ′x(i, j) = ( G * Φx )(i, j),
Φ′y(i, j) = ( G * Φy )(i, j),
where G is a Gaussian low-pass filter of size wΦ × wΦ.
The final smoothed orientation field O at pixel (i, j) is defined as:
O(i, j) = (1/2) arctan( Φ′y(i, j) / Φ′x(i, j) ).
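Steps 1–4 above can be sketched end-to-end as follows. This is a minimal sketch under stated assumptions, not the authors' implementation: the block size W = 16 and the smoothing width are illustrative values, and SciPy routines are used for the Sobel convolutions and Gaussian smoothing.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

# Standard 3x3 Sobel kernels for the x- and y-gradients (Step 2).
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def orientation_field(img, block=16, sigma=2.0):
    """Block-wise least-squares ridge orientation followed by
    vector-field Gaussian smoothing. `block` (W) and `sigma` are
    assumed parameter values, not taken from this work."""
    img = np.asarray(img, dtype=float)
    gx = convolve(img, SOBEL_X)          # delta_x at every pixel
    gy = convolve(img, SOBEL_Y)          # delta_y at every pixel
    h, w = img.shape
    rows, cols = h // block, w // block
    theta = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            rs = slice(r * block, (r + 1) * block)
            cs = slice(c * block, (c + 1) * block)
            vx = np.sum(2.0 * gx[rs, cs] * gy[rs, cs])
            vy = np.sum(gx[rs, cs] ** 2 - gy[rs, cs] ** 2)
            theta[r, c] = 0.5 * np.arctan2(vx, vy)  # least-squares estimate
    # Step 4: convert to a continuous vector field, smooth, convert back.
    phi_x = gaussian_filter(np.cos(2.0 * theta), sigma)
    phi_y = gaussian_filter(np.sin(2.0 * theta), sigma)
    return 0.5 * np.arctan2(phi_y, phi_x)
```

Doubling the angle before smoothing avoids averaging artefacts at the ±90° wrap-around, which is why the vector-field representation is used.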
The variance feature vector of orientation field is computed and is treated as a template or feature map. |
Vk = (1 / n) Σi ( O(i, k) − μk )²,  k = 1, 2, …, m,
where μk is the mean of the kth column of O(i, j), and m and n are the number of columns and rows of O, respectively.
The feature vector is given by |
v = [ V1, V2, …, Vm ].
IV. PALMPRINT IMAGE MATCHING
The variance feature vector of the orientation field for the query palmprint image is computed using the same steps as described earlier (4)-(11). Matching of the variance feature vectors of the query image and the template image from the stored database is carried out using the L2 norm.
D(u, v) = ‖u − v‖2 = ( Σk ( uk − vk )² )^(1/2),
where u and v are the feature vectors of the query and template palmprint images, respectively.
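Under the definitions above, the variance feature vector and the L2-norm matching score might be computed as follows; a minimal sketch with illustrative function names, not the authors' code:

```python
import numpy as np

def variance_feature(O):
    """Column-wise variance Vk of the smoothed orientation field O,
    giving the feature vector [V1, ..., Vm]."""
    O = np.asarray(O, dtype=float)
    mu = O.mean(axis=0)                  # mu_k: mean of the k-th column
    return ((O - mu) ** 2).mean(axis=0)  # average over the n rows

def match_score(u, v):
    """L2-norm distance between query and template feature vectors;
    a smaller score indicates a closer match."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return float(np.linalg.norm(u - v))
```

A query is accepted as a genuine match when its score against the stored template falls below a decision threshold.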
V. RESULTS AND DISCUSSIONS
We have used the PolyU database from The Hong Kong Polytechnic University (PolyU), which consists of 7752 grayscale images from 193 users, corresponding to 386 different palms. Around 17 images per palm were collected in two sessions. The images were captured with a CCD-based device at a spatial resolution of 75 dots per inch and 256 gray levels, with pegs constraining the hand placement. A sample PolyU hand image, the extracted palmprint and the orientation field are shown in Figure 1. For experimentation, the database is divided into a training set and a testing set: six randomly selected images per palm are used for training and the remaining images are used for testing.
The performance of the algorithm has been measured in terms of false acceptance rate (FAR) and false rejection rate (FRR) at various thresholds. The FAR and FRR are computed as follows. Let N be the number of subjects, with 17 palmprints each, so the total number of palmprint images in the database is T = 17 × N. A single template per subject has been considered for experimentation. The total number of trials carried out for finding true claims and imposter claims is N × (T − 6), of which the total true claims are N × 11 and the imposter claims are (total trials − true claims). Using these, we get,
FRR = (true claims rejected/total true claims) × 100%, |
FAR = (imposter claims accepted/total imposter claims) × 100% and |
GAR = 100 − FRR in percentage |
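The trial counts and error rates defined above can be sketched as follows; the helper names are illustrative, and the constants 17, 6 and 11 follow the protocol described in the text (17 images per palm, 6 for training, 11 for testing):

```python
def trial_counts(N):
    """Trial bookkeeping for N subjects with 17 palmprints each,
    following the counting scheme described in the text."""
    T = 17 * N                                # total images in the database
    total_trials = N * (T - 6)
    true_claims = N * 11
    imposter_claims = total_trials - true_claims
    return total_trials, true_claims, imposter_claims

def error_rates(true_rejected, total_true, imposter_accepted, total_imposter):
    """FRR, FAR and GAR in percent, as defined above."""
    frr = 100.0 * true_rejected / total_true
    far = 100.0 * imposter_accepted / total_imposter
    gar = 100.0 - frr
    return far, frr, gar
```

Sweeping the decision threshold and recomputing these rates at each setting yields the FAR/FRR curves of the kind plotted in Figure 2.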
The algorithm has been tested for every possible combination to compute the FAR and FRR at different thresholds, which can be plotted as in Figure 2.
References
- D. D. Zhang, “Palmprint Authentication (International Series on Biometrics),” Springer-Verlag New York, Inc., Secaucus, NJ, USA, 2004.
- R. Wildes, “Iris recognition: an emerging biometric technology,” Proceedings of the IEEE, 85 (9) (1997) pp 1348-1363.
- A. Jain, R. Bolle, S. Pankanti, “Biometrics: Personal Identification in Networked Society,” Kluwer Academic, 1999.
- International Biometric Group, “Consumer response to biometrics,” http://www.ibgweb.com/reports/public/reports/facial scan perceptions.html (2002).
- A. Kong, D. Zhang, G. Lu, “A study of identical twins palmprints for personal authentication,” Pattern Recognition, 39 (11) (2006) pp 2149-2156.
- C. Han, H. Cheng, C. Lin, K. Fan, “Personal authentication using palmprint features,” Pattern Recognition, 36 (2) (2003) pp 371-381.
- N. Duta, A. Jain, K. Mardia, “Matching of palmprints,” Pattern Recognition Letters, 23 (4) (2002) pp 477-485.
- J. Funada, N. Ohta, M. Mizoguchi, T. Temma, K. Nakanishi, A. Murai, T. Sugiuchi, T. Wakabayashi, Y. Yamada, “Feature extraction method for palmprint considering elimination of creases,” in Proceedings of the International Conference on Pattern Recognition, Vol. 2, 1998, pp 1849-1854.
- D. Zhang, W. Shu, “Two novel characteristics in palmprint verification: datum point invariance and line feature matching,” Pattern Recognition, 32 (4) (1999) pp 691-702.
- J. Chen, C. Zhang, G. Rong, “Palmprint recognition using creases,” in Proceedings of the International Conference on Image Processing, 2001, pp 234-237.
- X. Wu, K. Wang, D. Zhang, “Fuzzy directional element energy feature based palmprint identification,” in Proceedings of the International Conference on Pattern Recognition, Vol. 1, 2002.
- A. K. Jain, J. Feng, “Latent palmprint matching,” IEEE Transactions on Pattern Analysis and Machine Intelligence.
- G. Lu, D. Zhang, K. Wang, “Palmprint recognition using eigenpalms features,” Pattern Recognition Letters, 24 (9-10) (2003) pp 1463-1467.
- X. Wu, D. Zhang, K. Wang, “Fisherpalms based palmprint recognition,” Pattern Recognition Letters, 24 (2003) pp 2829-2838.
- L. Shang, D.-S. Huang, J.-X. Du, C.-H. Zheng, “Palmprint recognition using FastICA algorithm and radial basis probabilistic neural network,” Neurocomputing, 69 (13-15) (2006) pp 1782-1786.