2016 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2016.7727608

A null space based one class kernel Fisher discriminant

Cited by 7 publications (11 citation statements), 2019-2023. References 20 publications.

Citation statements:
“…Since novel examples do not exist prior to test time, training is carried out using one-class learning principles. In the previous works [4], [11], the performance of novelty detection has been assessed based on different classes of the ImageNet and the Caltech 256 datasets. Since all CNNs used in our work have been trained using the ImageNet dataset, we use the Caltech 256 dataset to evaluate the performance of one-class novelty detection.…”
Section: B. Results (mentioning)
confidence: 99%
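
As a concrete reading of the protocol quoted above, the following is a minimal sketch of one-class novelty evaluation on pre-extracted CNN features. The array names X_target and X_novel, the one-class SVM baseline, and the 70/30 target split are illustrative assumptions, not the exact setup of [4], [11].

import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def one_class_auc(X_target, X_novel, nu=0.1, seed=0):
    # Hold out part of the target class for testing; novel samples are
    # never seen during training (one-class learning principle).
    X_fit, X_eval = train_test_split(X_target, test_size=0.3, random_state=seed)
    model = OneClassSVM(kernel="rbf", gamma="scale", nu=nu).fit(X_fit)

    # Higher decision values mean "more normal": label target 1, novel 0.
    scores = np.concatenate([model.decision_function(X_eval),
                             model.decision_function(X_novel)])
    labels = np.concatenate([np.ones(len(X_eval)), np.zeros(len(X_novel))])
    return roc_auc_score(labels, scores)

Repeating this over several target classes and random splits gives mean-and-standard-deviation AUC figures of the kind quoted in the next statement.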
“…Dev.) One Class SVM [43] 0.606 (0.003); KNFST [4] 0.575 (0.004); Oc-KNFD [11] 0.619 (0.003); Autoencoder [15] 0.532 (0.003); OCNN AlexNet [5] 0.907 (0.029); OCNN VGG16 [5] 0… classes. Therefore, in this particular case, ImageNet samples are a good representative of novel object classes present in Caltech 256.…”
Section: B. Results (mentioning)
confidence: 99%
“…Another drawback is that the method is based on the assumption that the target population differs from the outlier population in terms of their respective densities, which might not hold for real-world problems in general. A later study [45] tries to address these shortcomings via a null-space variant of the method in [39], [52]. In order to overcome the limitation of the availability of outlier samples, it is proposed to separate the target class from the origin of the kernel feature space, which serves as an artificial outlier sample.…”
Section: Related Work (mentioning)
confidence: 99%
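
A rough numpy sketch of the idea summarized in that passage: the target class together with the origin of the kernel feature space (whose kernel values against every training sample are zero) forms a two-class problem, and data are projected onto directions of zero within-class scatter. This is an illustrative reconstruction under assumed choices (RBF kernel, eigenvalue tolerance, function names), not the exact algorithm of [45].

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix between the rows of X and Y.
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def fit_null_space_one_class(X_target, gamma=1.0, tol=1e-8):
    # Augment the target kernel matrix with one artificial sample: the origin
    # of the feature space, whose kernel values against every sample are zero.
    n = X_target.shape[0]
    N = n + 1
    K = np.zeros((N, N))
    K[:n, :n] = rbf_kernel(X_target, X_target, gamma)
    labels = np.array([0] * n + [1])        # target class vs. artificial origin

    # Centre the kernel matrix in feature space.
    J = np.full((N, N), 1.0 / N)
    Kc = K - J @ K - K @ J + J @ K @ J

    # Orthonormal basis of the centred data span (sample-space coefficients).
    w, V = np.linalg.eigh(Kc)
    keep = w > tol
    basis = V[:, keep] / np.sqrt(w[keep])   # (N, m)
    T = basis.T @ Kc                        # sample coordinates, (m, N)

    # Within-class scatter in the reduced space and its null space.
    Sw = np.zeros((T.shape[0], T.shape[0]))
    for c in (0, 1):
        Tc = T[:, labels == c]
        Tc = Tc - Tc.mean(axis=1, keepdims=True)
        Sw += Tc @ Tc.T
    w2, V2 = np.linalg.eigh(Sw)
    proj = basis @ V2[:, w2 < tol]          # null-space directions, (N, r)

    # All target samples collapse onto (nearly) one point in the null space.
    target_point = (proj.T @ Kc[:, :n]).mean(axis=1)
    return proj, target_point, K, X_target, gamma

def novelty_scores(model, X_test):
    # Distance to the target point in the null space; larger means more novel.
    proj, target_point, K, X_target, gamma = model
    n, N = X_target.shape[0], K.shape[0]
    k = np.zeros((N, X_test.shape[0]))
    k[:n, :] = rbf_kernel(X_target, X_test, gamma)
    J = np.full((N, N), 1.0 / N)
    C = np.full((N, X_test.shape[0]), 1.0 / N)
    kc = k - J @ k - K @ C + J @ K @ C      # centre consistently with training
    return np.linalg.norm(proj.T @ kc - target_point[:, None], axis=0)

The resulting distances can be plugged into the same AUC computation sketched after the first citation above.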
“…Nevertheless, the high computational cost associated with these methods can be considered as a bottleneck in their usage. For instance, the one-class variants of kernel discriminant analysis [38], [45], [39], [46] often require computationally intensive eigendecompositions of dense matrices.…”
Section: Introduction (mentioning)
confidence: 99%
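
The cost remark is easy to make concrete: a full eigendecomposition of a dense n-by-n kernel matrix scales roughly cubically in the number of training samples. A throwaway timing sketch (machine-dependent, purely indicative):

import time
import numpy as np

rng = np.random.default_rng(0)
for n in (1000, 2000, 4000):
    A = rng.standard_normal((n, n))
    K = A @ A.T                      # dense symmetric, kernel-matrix-like
    t0 = time.perf_counter()
    np.linalg.eigh(K)                # roughly O(n^3) in practice
    print(n, round(time.perf_counter() - t0, 2), "s")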
“…from multi-class discriminant analysis approaches dealing with this small sample size problem [11] indicates that null space directions contain high discrimination power [12], [13], [14]. Interestingly, one-class discrimination approaches based on null space analysis have also been proposed recently [15], [16]. However, the latter ones are in fact designed by following a multi-class setting and cannot be directly extended for class-specific discrimination.…”
Section: Introduction (mentioning)
confidence: 99%
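
For reference, the "null space directions" mentioned here are conventionally defined as follows; the notation (within-class scatter S_w, between-class scatter S_b) is standard and not copied from [12], [13], [14]:

\[
  \mathcal{N} = \{\, w \neq 0 \;:\; w^{\top} S_w w = 0 \ \text{and} \ w^{\top} S_b w > 0 \,\}
\]

Along any such direction the Fisher ratio J(w) = (w^T S_b w) / (w^T S_w w) is unbounded, which is the sense in which null-space directions carry the highest discrimination power; they exist only when S_w is singular, i.e. exactly in the small sample size regime where there are fewer training samples than feature dimensions.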