2018
DOI: 10.1007/978-3-319-77538-8_34
Evolvable Deep Features

Cited by 5 publications (4 citation statements)
References 13 publications
“…One major limitation of adding the RoI pooling method is the need to determine its parameters (including the number and size of the bounding boxes, and the kernel size). Note that feature selection methods can improve classification efficiency [59]; however, differences between selection methods affect which features are prioritised as effective for classification. Compressing the deep features still yields a relatively high number of features, but retains more information and maintains high stability.…”
Section: Discussion (mentioning)
confidence: 99%
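The parameters this quoted discussion singles out (the number and size of the regions of interest, and the pooled output/kernel size) can be made concrete with torchvision's RoI pooling. The sketch below is purely illustrative: the feature-map shape, box coordinates, and output size are assumed values, not settings from the cited work.

```python
# Illustrative only: shows where the "number and size of bounding boxes" and
# the pooled kernel/output size enter an RoI pooling call. Values are assumptions.
import torch
from torchvision.ops import roi_pool

feature_map = torch.randn(1, 256, 32, 32)   # one image's deep feature map (N, C, H, W)

# Each row is (batch_index, x1, y1, x2, y2) in feature-map coordinates.
# The *number* and *size* of these boxes are the hand-chosen parameters
# the quoted passage points to as a limitation.
rois = torch.tensor([
    [0,  0.0,  0.0, 15.0, 15.0],
    [0,  8.0,  8.0, 31.0, 31.0],
], dtype=torch.float32)

# output_size is the pooled "kernel" size per RoI.
pooled = roi_pool(feature_map, rois, output_size=(7, 7), spatial_scale=1.0)
print(pooled.shape)  # torch.Size([2, 256, 7, 7])
```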
“…We additionally mine batch-hard positive and negative examples and introduce corresponding positive-sample and negative-sample loss functions to further help the network separate identity classes. …[47] for machine learning classification. However, re-ID differs significantly from standard classification tasks in several ways: i) our model exploits metric learning rather than a classification loss; ii) the testing identities in re-ID do not appear in training, whereas all classes appear in training for regular classification tasks; iii) the evaluation metric is based on information retrieval rather than classification accuracy.…”
Section: B. Triplets (mentioning)
confidence: 99%
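Batch-hard mining as described in this quote (hardest positive and hardest negative per anchor within a mini-batch) is commonly implemented along the lines of the sketch below. This is a generic formulation with an assumed margin and embedding size, not the citing authors' exact loss.

```python
# A minimal batch-hard triplet loss sketch (common re-ID practice); margin and
# embedding dimension are assumptions, not values from the cited paper.
import torch
import torch.nn.functional as F

def batch_hard_triplet_loss(embeddings, labels, margin=0.3):
    # Pairwise Euclidean distances between all embeddings in the batch.
    dist = torch.cdist(embeddings, embeddings, p=2)

    same_id = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos_mask = same_id & ~torch.eye(len(labels), dtype=torch.bool)   # same identity, excluding self
    neg_mask = ~same_id                                              # different identity

    # Hardest positive: farthest sample sharing the anchor's identity.
    hardest_pos = (dist * pos_mask).max(dim=1).values
    # Hardest negative: closest sample with a different identity.
    hardest_neg = dist.masked_fill(~neg_mask, float('inf')).min(dim=1).values

    return F.relu(hardest_pos - hardest_neg + margin).mean()

# Usage: embeddings from a re-ID backbone, integer identity labels.
emb = F.normalize(torch.randn(8, 128), dim=1)
ids = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
loss = batch_hard_triplet_loss(emb, ids)
```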
“…Since the feature channels extracted from different layers are too numerous to fit our network effectively, we need to adopt deep feature selection. Recently, Nalepa et al. [57] introduced a method for automatic deep feature selection that combines a genetic algorithm with deep learning to reduce the number of feature channels. We find this method novel, but it adds a level of complexity similar to that of a traditional genetic algorithm, which is inconvenient when we test the performance of our algorithm.…”
Section: A. CNN Features (mentioning)
confidence: 99%
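For readers unfamiliar with the idea summarised in this quote, a toy genetic-algorithm loop over a binary channel mask looks roughly like the sketch below. The fitness function `evaluate_mask` is a hypothetical placeholder (a real run would score a classifier trained on the selected channels), and nothing here reproduces the actual algorithm of Nalepa et al. [57].

```python
# Toy sketch of GA-based feature-channel selection: a population of binary
# channel masks evolved against a placeholder fitness. Not the method of [57].
import random

N_CHANNELS = 512
POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 0.01

def evaluate_mask(mask):
    # Hypothetical placeholder: random accuracy proxy minus a channel-count penalty.
    # In practice this would validate a classifier on the channels where mask[i] == 1.
    return random.random() - 0.001 * sum(mask)

def crossover(a, b):
    cut = random.randrange(1, N_CHANNELS)       # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask):
    return [bit ^ 1 if random.random() < MUT_RATE else bit for bit in mask]

population = [[random.randint(0, 1) for _ in range(N_CHANNELS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=evaluate_mask, reverse=True)
    parents = scored[:POP_SIZE // 2]            # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=evaluate_mask)
print(f"selected {sum(best)} of {N_CHANNELS} channels")
```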