2021 55th Asilomar Conference on Signals, Systems, and Computers
DOI: 10.1109/ieeeconf53345.2021.9723382
Model selection and explainability in neural networks using a polytope interpolation framework

Cited by 7 publications (8 citation statements) | References 12 publications
“…NNK has been shown to perform well in several machine learning tasks [15], image representation [16], and generalization estimation in neural networks [17]. Furthermore, NNK has also been used to understand convolutional neural networks (CNN) channel redundancy [18] and to propose an early stopping criterion for them [19].…”
Section: Non-negative Kernel (NNK) Regression Graphs (mentioning, confidence: 99%)
“…NNK has been shown to deliver good results in several machine learning tasks [15], image representation [16], and generalization estimation in neural networks [17]. Furthermore, NNK has also been used to understand convolutional neural networks (CNN) channel redundancy [18] and to propose an early stopping criterion for them [19].…”
Section: Non-negative Kernel (NNK) Regression Graphs (mentioning, confidence: 99%)
“…For these reasons, we make use of non-negative kernel regression (NNK) [31] to define neighborhoods and graphs for our manifold analysis. Unlike KNN, which is a thresholding approximation, NNK can be viewed as a form of basis pursuit [41] and results in better neighborhood construction with improved and robust local estimation performance in various machine learning tasks [13], [32]. The key advantage of NNK is that it has a geometric interpretation for each neighborhood constructed.…”
Section: Supplementary Materials A: Non-negative Kernel Regression (NNK) (mentioning, confidence: 99%)
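The NNK construction quoted above reduces, for each query point, to a small non-negative quadratic program over a pre-selected candidate neighbor set. The sketch below is a minimal illustration of that idea, assuming a Gaussian kernel and a k-nearest-neighbor pre-selection step; the function names (nnk_weights, nnk_neighbors) and all parameter choices are hypothetical and not taken from the cited papers.

```python
import numpy as np
from scipy.optimize import nnls

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y (an assumed kernel choice)."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def nnk_weights(K_SS, k_Si, reg=1e-8):
    """Solve min_{theta >= 0} 0.5 * theta^T K_SS theta - k_Si^T theta.

    Factoring K_SS = L L^T rewrites the problem as non-negative least
    squares, min_{theta >= 0} ||L^T theta - b||^2 with b = L^{-1} k_Si,
    which scipy's nnls solves directly.
    """
    n = K_SS.shape[0]
    L = np.linalg.cholesky(K_SS + reg * np.eye(n))  # small ridge for numerical stability
    b = np.linalg.solve(L, k_Si)                    # b = L^{-1} k_Si
    theta, _ = nnls(L.T, b)
    return theta

def nnk_neighbors(X, i, k=10, sigma=1.0):
    """NNK neighborhood of point i: pre-select the k most similar candidates,
    then keep only those receiving a non-zero non-negative regression weight."""
    K = gaussian_kernel(X, X, sigma)
    order = np.argsort(-K[i])                       # candidates sorted by similarity
    S = [j for j in order if j != i][:k]            # exclude the point itself
    theta = nnk_weights(K[np.ix_(S, S)], K[S, i])
    return {j: w for j, w in zip(S, theta) if w > 1e-10}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    print(nnk_neighbors(X, i=0, k=10, sigma=2.0))
```

The Cholesky reduction works because 0.5 * theta^T K_SS theta - k_Si^T theta equals 0.5 * ||L^T theta - L^{-1} k_Si||^2 up to a constant. Candidates that receive zero weight are pruned, which is what distinguishes NNK from plain KNN thresholding and yields the sparse, geometry-aware neighborhoods the quoted passages describe.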
“…This allows a distance to be computed between actions with different durations. We choose the NNK construction due to its robust performance in local estimation across different machine learning tasks [13]. The benefits of the NNK construction will be demonstrated through a comparison with k-NN DS-Graph constructions in Section IV.B.…”
Section: Introduction (mentioning, confidence: 99%)