
Assignment on Periocular Recognition


The feasibility of using the periocular region as a stand-alone trait for biometric recognition was first studied by Park et al. [35] in 2009. By combining local features, specifically LBP, HOG, and SIFT, they reported promising results that established the utility of the periocular region as a biometric trait. Two years later, Park et al. [2] performed extensive experiments to investigate how different factors affect the performance of periocular recognition. They found that the eyebrow contains the most discriminative features, and that variations in pose and expression, template aging, and occlusion are among the main degradation factors.
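To make the idea of local-feature fusion concrete, the sketch below extracts an LBP texture histogram and a HOG descriptor from a periocular crop and concatenates them into a single feature vector (SIFT is omitted for brevity). It is a minimal illustration using scikit-image with assumed parameter values, not the configuration used by Park et al.

```python
# Minimal sketch of local-feature fusion for a periocular image.
# Parameter values are illustrative assumptions, not those of Park et al. [35].
import numpy as np
from skimage import io, transform
from skimage.feature import local_binary_pattern, hog

def periocular_descriptor(image_path, size=(128, 128)):
    """Return a fused LBP + HOG feature vector for one periocular image."""
    gray = io.imread(image_path, as_gray=True)
    gray = transform.resize(gray, size, anti_aliasing=True)

    # LBP texture histogram (uniform patterns with 8 neighbours and radius 1
    # produce P + 2 = 10 possible codes).
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # HOG gradient descriptor computed over the whole crop.
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2), feature_vector=True)

    # Feature-level fusion here is simple concatenation of the two vectors.
    return np.concatenate([lbp_hist, hog_vec])

# Two images can then be compared with, e.g., the cosine or Euclidean
# distance between their fused descriptors.
```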

Motivated by these exploratory works, researchers went on to exploit a variety of hand-crafted techniques. Many works in the literature focused on LBP and its variants [7], [8], although features other than LBP have also been employed. The authors of [31] chose LCH, which yielded the best reported accuracy on the FRGC database. Ambika et al. [3] proposed fusing shape and texture information by extracting Local Binary Pattern Variance (LBPV) features together with Zernike moments of the images, in order to reduce the effects of expression variation and pose. LBPV captures local contrast information and provides rotation-invariant texture features, while the Zernike moments complement them with a description suited to shape classification. Cho et al. [27] focused on the effect of eye rotation and claimed that mapping the pixels of the input image from Cartesian to polar coordinates before applying a feature descriptor reduces this effect. Gangwar and Joshi [33] employed a combination of LPQ and a Gabor-magnitude descriptor for feature extraction, demonstrating the effectiveness of phase descriptors in periocular biometrics. Bakshi et al. [9] also argued for the effectiveness of phase information and proposed a global feature descriptor, the Phase Intensive Global Pattern (PIGP), which depends on the phase of intensity variation among neighbouring pixels. In [41] they extended this work by developing a local descriptor, the Phase Intensive Local Pattern (PILP), which applies the phase information at image key points rather than globally.
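The Cartesian-to-polar remapping used to counter eye rotation is straightforward to illustrate. The sketch below is an assumed, minimal version of such a remapping using OpenCV's warpPolar; it is not Cho et al.'s exact pipeline, and the eye-centre estimate is a placeholder.

```python
# Minimal sketch (an assumption, not Cho et al.'s exact method) of remapping a
# periocular crop from Cartesian to polar coordinates, so that an in-plane eye
# rotation becomes a circular shift along one axis of the output, which a
# subsequently applied descriptor can tolerate more easily.
import cv2
import numpy as np

def to_polar(gray, center=None, out_size=(256, 256)):
    """Remap a grayscale eye image into polar coordinates.

    `center` defaults to the image centre; in practice it would be the
    detected eye/iris centre.
    """
    h, w = gray.shape[:2]
    if center is None:
        center = (w / 2.0, h / 2.0)
    max_radius = min(center[0], center[1], w - center[0], h - center[1])
    return cv2.warpPolar(gray, out_size, center, max_radius,
                         cv2.WARP_POLAR_LINEAR + cv2.INTER_LINEAR)

# A rotation of the eye about `center` in the input appears as a shift of rows
# in the polar image, so a texture descriptor (e.g. LBPV histograms) computed
# on it is less sensitive to that rotation.
gray = np.zeros((128, 128), dtype=np.uint8)  # placeholder image
polar = to_polar(gray)
```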

In 2015, Smereka et al. [1] proposed the Periocular Probabilistic Deformation Model (PPDM), which effectively models the potential deformation between pairs of periocular images. The deformation is inferred using correlation filters and is then used when matching periocular pairs. The same group continued this line of research and improved their basic model in 2016 by identifying discriminative patch regions, so that only informative patches are taken into account during matching [11]. Both methods showed strong performance on a variety of datasets. However, because both rely on a patch-based matching scheme, their tolerance to scale variation is lower than expected and patch correspondence can be violated.
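The idea of comparing two periocular images patch by patch can be sketched as follows. This is only a simplified illustration of correlation-based patch matching under assumed patch sizes; it is not the actual PPDM, which additionally models the probability of each patch's displacement.

```python
# Simplified sketch of patch-based matching with normalised correlation.
# Illustrative only; not the PPDM of Smereka et al. [1].
import numpy as np

def _ncc(a, b, eps=1e-8):
    """Normalised correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

def patch_match_score(img1, img2, patch=(32, 32)):
    """Average correlation over a grid of corresponding patches.

    Both inputs are assumed to be aligned grayscale crops of the same size.
    """
    ph, pw = patch
    h, w = img1.shape
    scores = []
    for y in range(0, h - ph + 1, ph):
        for x in range(0, w - pw + 1, pw):
            scores.append(_ncc(img1[y:y + ph, x:x + pw].astype(float),
                               img2[y:y + ph, x:x + pw].astype(float)))
    return float(np.mean(scores))

# A higher score indicates a better match; in a full system the per-patch
# scores would be weighted by patch discriminability (as in [11]) and the
# allowed patch displacement would be modelled explicitly.
```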


Periocular Datasets: 1) The University of Beira Interior Periocular (UBIPr) database [10] contains unconstrained images captured in the visible spectrum, with variation in scale, pigmentation, eyeball movement, distance, occlusion, illumination, and head pose. The camera distance varied between 4 m and 8 m in steps of 1 m, and the image resolution varied accordingly. Images are stored in the sRGB format; 54.4% of the subjects are male and 45.6% are female.

2) The Face Recognition Grand Challenge (FRGC) [31]: Released by the National Institute of Standards and Technology (NIST), this database contains still images in the visible spectrum captured over several recording sessions with varying illumination and expression. Images were also captured under controlled scenarios with a neutral expression, a fixed distance from the camera, and controlled lighting conditions. Ethnicity and gender information is included in the dataset.

3) The Face and Ocular Challenge Series (FOCS) [34]: Also released by NIST, this dataset contains ocular images and videos acquired in the near-infrared (NIR) spectrum. The images exhibit varying degrees of illumination, specular reflection, and occlusion, and a large portion of them are of degraded quality owing to blur and sensor noise.

4) VISOB [43]: A competition dataset of ocular images captured under office-light, dim-light, and daylight conditions using three mobile phones: a Samsung Galaxy Note 4, an Oppo N1, and an iPhone 5s. The images were captured under unconstrained conditions and therefore exhibit variation in illumination, off-angle gaze, makeup, blur, and occlusion.

5) UBIRIS.v2: The University of Beira Interior Iris (UBIRIS) database is freely available and contains visible-spectrum images acquired in an uncontrolled environment. UBIRIS.v2 was collected under non-constrained conditions: noisy eye images were captured deliberately to simulate non-cooperative capture in a real-world setting. The eye images exhibit a range of degradations, including specular reflections, partially captured irises, poor iris focus, motion blur, glare, and obstruction by eyelashes and eyelids. The dataset has been used to evaluate iris recognition and segmentation schemes in the NICE.I and NICE.II international competitions [103], [20].
