2018
DOI: 10.1109/tip.2017.2775060
Latent Constrained Correlation Filter

Abstract: Correlation filters are special classifiers designed for shift-invariant object recognition, which are robust to pattern distortions. The recent literature shows that combining a set of sub-filters, each trained on a single image or a small group of images, obtains the best performance. The idea is equivalent to estimating a variable distribution based on data sampling (bagging), which can be interpreted as finding solutions (variable distribution approximation) directly from the sampled data space. However, this metho…
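The abstract's building block — a shift-invariant correlation filter — can be sketched as a single frequency-domain ridge regression (a MOSSE-style filter; this is illustrative only, not the paper's latent-constrained ensemble, and the function names are our own):

```python
import numpy as np

def train_correlation_filter(template, desired, lam=1e-2):
    """Closed-form frequency-domain ridge regression for one filter
    (MOSSE-style; illustrative, NOT the paper's LCCF method).
    Returns H* such that desired_hat ~= template_hat * H*."""
    T = np.fft.fft2(template)   # training image spectrum
    G = np.fft.fft2(desired)    # desired (e.g. Gaussian) response spectrum
    # Ridge-regularized closed form: H* = (G . conj(T)) / (T . conj(T) + lam)
    return (G * np.conj(T)) / (T * np.conj(T) + lam)

def respond(image, h_star):
    """Response map via the convolution theorem; the peak locates the
    pattern, and a shifted input yields a correspondingly shifted peak
    (this is the shift invariance the abstract refers to)."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * h_star))
```

A shifted copy of the training image moves the response peak by exactly the same offset, which is why correlation filters localize patterns regardless of position.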


Cited by 44 publications (27 citation statements)
References 39 publications
“…The extensive experiments show that GCNs significantly improved baselines, resulting in the state-of-the-art performance over several benchmarks. In the future, more GCN architectures (larger ones) will be tested on other tasks, such as object tracking, detection and segmentation [41], [42], [43], [44]. …”
Section: E Experiments On Food-101 Dataset
confidence: 99%
“…A group of metrics including accuracy, recall and precision were employed to quantify the classification accuracy. The definitions of accuracy, recall, and precision are given as Equations (2)–(4), where true positive (TP), true negative (TN), false positive (FP), and false negative (FN) are labeled in Figure 4 [ 38 ]. …”
Section: Results
confidence: 99%
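The metrics this citing paper refers to as its Equations (2)–(4) are the standard confusion-matrix definitions, which can be written out directly (a minimal sketch; the function name is our own):

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, recall, and precision from confusion-matrix counts
    (the standard definitions behind the paper's Equations (2)-(4))."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    recall = tp / (tp + fn)      # fraction of actual positives recovered
    precision = tp / (tp + fp)   # fraction of predicted positives that are correct
    return accuracy, recall, precision
```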
“…Since Hinton [ 35 , 36 ] proposed a greedy layer-wise pre-training algorithm to initialize the weights of deep architectures, artificial neural networks have been revived. Deep neural networks have become a new popular topic and have advanced image classification, object tracking [ 37 ] and recognition [ 38 ], gesture recognition [ 39 ], action recognition [ 40 ], defect inspection [ 41 , 42 , 43 , 44 ], voice recognition, natural language understanding, etc. Popular deep learning frameworks include stacked autoencoders, convolutional neural networks, and restricted Boltzmann machines.…”
Section: Proposed Methods
confidence: 99%
“…Algorithm 2 Dijkstra-distance Based Correlation Filters. 1: Set t = 0, ε_best = +∞, η = 0.7 (suggested in [35]…”
Section: Methods
confidence: 99%