2015
DOI: 10.1007/s11042-014-2424-1
Landmark recognition with compact BoW histogram and ensemble ELM

Cited by 74 publications (30 citation statements). References 39 publications.
“…The former consists of converting the query IPs into a bag of k clusters (obtained from the reference dataset using a clustering algorithm, e.g. k-means) and then representing the image (or video) as a histogram of the occurrence of each cluster [25]. The latter consists of determining for each query IP the closest IP from the reference set.…”
Section: IPs Matching Via Sparse Representation
Confidence: 99%
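The bag-of-words step quoted above assigns each local interest-point (IP) descriptor to its nearest cluster center and counts occurrences. A minimal sketch of that histogram construction, assuming the k cluster centers (the visual vocabulary) have already been learned with k-means on the reference dataset; the function name and toy data are illustrative, not from the paper:

```python
import numpy as np

def bow_histogram(descriptors, centers):
    """Assign each local descriptor to its nearest cluster center
    (visual word) and return an L1-normalized occurrence histogram."""
    # Pairwise squared distances, shape (n_descriptors, k)
    d2 = ((descriptors[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    assignments = d2.argmin(axis=1)  # nearest visual word per descriptor
    hist = np.bincount(assignments, minlength=len(centers)).astype(float)
    return hist / hist.sum()  # normalize so images of different sizes compare

# Toy example: k = 3 visual words in a 2-D descriptor space
centers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
descs = np.array([[0.1, 0.2], [9.8, 0.1], [0.2, 9.9], [0.0, 0.1]])
h = bow_histogram(descs, centers)  # two descriptors fall near the first center
```

In practice the descriptors would be SURF or similar local features, and the resulting fixed-length histogram is what gets fed to the classifier regardless of how many interest points the image produced.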
“…Re-identification rate (%):

Using all test dataset:
  [10]: 19.2
  SURF-1NN [42]: 22.5
  Our approach (only SURF-SR): 27
  Our approach (only Cuboids-SR): 25.5
  Our approach (Fusion, α = 0.5): 28.5
  Our approach (Fusion, α = 0.3): 31

Using only the 200 common people in evaluation:
  Our approach: 31
  [48]: 28.9…”
Section: Approach
Confidence: 99%
“…With random assignment of the input weights, the output weights of ELMs can be directly determined by solving a linear least-squares problem, which bears similarities to designing the output weights of a radial basis function (RBF) network [37], [38] or of neural networks with other activation functions [39], [40]. Other representative work on ELMs includes using an evolutionary strategy to optimize the input weights so that a compact ELM can be obtained [41], ELMs with sparse representation [42], integrating fuzzy logic with ELMs to improve approximation performance [43], employing an ensemble of classifiers with the ELM as the base learner for performance improvement [44], applications of ELMs to effective recognition of landmarks [45], [46], and insightful interpretation of ELMs from the perspective of random neurons, random features, and kernels [47].…”
Section: Introduction
Confidence: 99%
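The ELM training scheme described above can be sketched in a few lines: the input weights and biases are drawn at random and frozen, and only the output weights are solved for by linear least squares on the hidden-layer activations. This is a minimal illustration, assuming a tanh activation and a toy regression target; the function names and data are not from the paper:

```python
import numpy as np

def elm_train(X, Y, n_hidden, rng):
    """Train an extreme learning machine: random fixed input weights,
    output weights from a linear least-squares fit."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights, never updated
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activation matrix
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)     # solve H @ beta ≈ Y in one step
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
Y = 2.0 * X[:, :1] + X[:, 1:2]                       # simple target for illustration
W, b, beta = elm_train(X, Y, n_hidden=50, rng=rng)
pred = elm_predict(X, W, b, beta)
```

The absence of iterative gradient descent is what makes ELM training fast; ensemble variants, as in the cited paper, train several such networks with different random weights and combine their outputs.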
“…In recent decades, delayed neural networks have attracted considerable attention because they arise in many areas, such as signal processing, model identification, and optimization [1][2][3][4][5]. Since time delays are frequently encountered, the stability analysis of time-delayed neural networks has recently become an important topic in the field of neural networks.…”
Section: Introduction
Confidence: 99%