Study and Development of Iris Segmentation and Normalisation Techniques  

Abstract

Several security systems have been implemented using biometrics, especially for identification and verification. The term "biometrics" derives from the Greek words bio (life) and metric (to measure), that is, the quantitative measurement of a living being. One such biometric pattern is the iris of the human eye, which has been shown to be unique to each person. The iris recognition system consists of an automatic segmentation stage based on the Hough transform, which localises the circular iris and pupil regions as well as occluding eyelids, eyelashes, and reflections. The extracted iris region was then normalised into a rectangular block of constant dimensions using Daugman's rubber-sheet model, to account for imaging inconsistencies. Finally, the phase data from 1D Log-Gabor filters were extracted and quantised to four levels to encode the unique pattern of the iris into a bit-wise biometric template. The Hamming distance was employed to classify iris templates: two templates were declared a match if a test of statistical independence was failed.
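The two central steps described above, rubber-sheet normalisation and Hamming-distance matching with noise masks, can be sketched as follows. This is a minimal illustration, not the authors' implementation: function names, parameters, and resolutions (20 radial by 240 angular samples) are assumptions for the example.

```python
import numpy as np

def rubber_sheet_normalise(eye, pupil, iris, radial_res=20, angular_res=240):
    """Unwrap the annular iris region into a fixed-size rectangular block
    (Daugman's rubber-sheet model). pupil and iris are (x, y, r) circles,
    typically obtained from a circular Hough transform."""
    px, py, pr = pupil
    ix, iy, ir = iris
    thetas = np.linspace(0, 2 * np.pi, angular_res, endpoint=False)
    radii = np.linspace(0.0, 1.0, radial_res)
    out = np.zeros((radial_res, angular_res), dtype=eye.dtype)
    for j, t in enumerate(thetas):
        # Boundary points on the pupil and iris circles at this angle.
        xp, yp = px + pr * np.cos(t), py + pr * np.sin(t)
        xi, yi = ix + ir * np.cos(t), iy + ir * np.sin(t)
        for i, r in enumerate(radii):
            # Linear interpolation between the two boundaries (nearest pixel).
            x = int(round((1 - r) * xp + r * xi))
            y = int(round((1 - r) * yp + r * yi))
            out[i, j] = eye[y, x]
    return out

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing template bits, counted only where both noise
    masks mark the bits as valid (not occluded by eyelids, eyelashes, or
    reflections)."""
    valid = mask_a & mask_b
    n = valid.sum()
    if n == 0:
        return 1.0  # No comparable bits: treat as a non-match.
    return np.count_nonzero((code_a ^ code_b) & valid) / n
```

Under the statistical-independence test, the Hamming distance between templates from different eyes clusters near 0.5, so a distance well below that threshold means the test failed and the templates match.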

Authors and Affiliations

Anshu Parasha, Yogita Gulati

Keywords


  • EP ID EP157034
  • DOI -

How To Cite

Anshu Parasha, Yogita Gulati (2012). Study and Development of Iris Segmentation and Normalisation Techniques. International Journal of Advanced Research in Computer Engineering & Technology (IJARCET), 1(5), 237-240. https://europub.co.uk/articles/-A-157034