Proceedings of the 2019 3rd International Conference on Biometric Engineering and Applications
DOI: 10.1145/3345336.3345341
Empirical Evaluation of Texture-Based Print and Contact Lens Iris Presentation Attack Detection Methods

Cited by 3 publications (3 citation statements)
References: 24 publications
“…Experiments also show that the fusion method [12] outperforms the other two methods in both known and unknown settings. This finding agrees with [23].…”
Section: Comparison of Open Source Methods (supporting)
confidence: 92%
“…In [23], the authors compared five different PAD methods on four different datasets whose PAIs include printouts and patterned contact lenses. All five methods are traditional vision-based methods, where the feature extractors are adopted from previous PAD papers.…”
Section: Comparison of Methods Grouped by Datasets (mentioning)
confidence: 99%
“…Various iris PAD frameworks are grouped into hand-crafted and deep-learning-based methods. Hand-crafted features such as Local Binary Patterns (LBP) [6], [19], [28] and Binarized Statistical Image Features (BSIF), as well as their variations, were the research focal points until 2015 and made remarkable contributions to iris PAD. After that, with the rapid development and application of deep learning in multiple domains, especially computer vision, neural-network-based PAD methods quickly emerged.…”
Section: Iris Presentation Attack Detection (mentioning)
confidence: 99%
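The statement above names LBP as a typical hand-crafted texture feature for iris presentation attack detection. As a minimal sketch (not the evaluated papers' implementations), the basic 8-neighbor LBP compares each pixel to its neighbors, packs the comparisons into an 8-bit code, and uses the normalized histogram of codes as the texture descriptor fed to a classifier:

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbor Local Binary Pattern histogram.

    A simplified illustration of the hand-crafted texture features
    used in iris PAD; real PAD pipelines typically use rotation-
    invariant/uniform LBP variants and region-wise histograms.
    """
    img = np.asarray(img, dtype=np.float64)
    center = img[1:-1, 1:-1]  # interior pixels (borders skipped)
    # Offsets of the 8 neighbors, one bit each in the LBP code.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy,
                    1 + dx:img.shape[1] - 1 + dx]
        # Set this bit where the neighbor is >= the center pixel.
        codes |= ((neigh >= center).astype(np.uint8) << bit)
    # Normalized 256-bin histogram is the feature vector.
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()
```

A presentation attack detector would then train a standard classifier (e.g. an SVM) on such histograms extracted from bona fide and attack iris images.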