2020 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip40778.2020.9191232
Efficient Graph Construction For Image Representation

Cited by 9 publications (5 citation statements) | References 19 publications
“…Further, NNK neighborhoods are stable for large enough K, and their size is indicative of the ID of the manifold the data belongs to [15]. This method has been shown to provide advantages for semi-supervised learning, image representation [16], and generalization estimation [10]. We first proposed a channel-wise approach for NNK graphs in [17], where information from multiple channels was used to estimate generalization and perform early stopping without requiring a validation set.…”
Section: Introduction (confidence: 99%)
“…NNK has been shown to perform well in several machine learning tasks [15], image representation [16], and generalization estimation in neural networks [17]. Furthermore, NNK has also been used to understand convolutional neural networks (CNN) channel redundancy [18] and to propose an early stopping criterion for them [19].…”
Section: Non-negative Kernel (NNK) Regression Graphs (confidence: 99%)
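
The excerpts above describe applications of the paper's non-negative kernel (NNK) regression graphs: each node's edge weights are obtained by a non-negative kernel regression onto its K nearest neighbors, which prunes geometrically redundant neighbors and yields sparse, adaptive neighborhoods. As a rough illustration of the construction, here is a minimal NumPy/SciPy sketch, assuming a Gaussian kernel and solving each node's non-negative least-squares problem via a Cholesky factorization; the bandwidth `sigma`, the pruning tolerance `tol`, and all function names are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np
from scipy.optimize import nnls
from scipy.linalg import solve_triangular
from sklearn.neighbors import NearestNeighbors


def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel between rows of X and rows of Y (assumed kernel choice)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-np.maximum(d2, 0) / (2 * sigma**2))


def nnk_graph(X, k=10, sigma=1.0, tol=1e-8):
    """Sparse NNK adjacency sketch: for each node, run a non-negative kernel
    regression on its k nearest neighbors; zero-coefficient neighbors are pruned."""
    n = X.shape[0]
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)              # idx[:, 0] is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        S = idx[i, 1:]                       # k nearest neighbors of node i
        K_SS = gaussian_kernel(X[S], X[S], sigma)
        k_Si = gaussian_kernel(X[S], X[i:i + 1], sigma).ravel()
        # Solve min_theta 0.5 theta^T K_SS theta - theta^T k_Si  s.t. theta >= 0.
        # With K_SS = L L^T (Cholesky), this is NNLS with A = L^T, b = L^{-1} k_Si.
        L = np.linalg.cholesky(K_SS + 1e-10 * np.eye(k))
        b = solve_triangular(L, k_Si, lower=True)
        theta, _ = nnls(L.T, b)
        keep = theta > tol                   # NNK prunes redundant neighbors
        W[i, S[keep]] = theta[keep]
    return np.maximum(W, W.T)                # symmetrize the directed weights
```

Because neighbors that are already well explained by other selected neighbors receive exactly zero weight, the resulting graph is typically much sparser than a plain K-NN graph built with the same K, consistent with the stability of NNK neighborhoods for large K noted in the first excerpt.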