2021
DOI: 10.48550/arxiv.2107.12972
Preprint

Channel-Wise Early Stopping without a Validation Set via NNK Polytope Interpolation

Abstract: State-of-the-art neural network architectures continue to scale in size and deliver impressive generalization results, although this comes at the expense of limited interpretability. In particular, a key challenge is to determine when to stop training the model, as this has a significant impact on generalization. Convolutional neural networks (ConvNets) comprise high-dimensional feature spaces formed by the aggregation of multiple channels, where analyzing intermediate data representations and the model's evolution…
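
As context for the abstract's motivation, the conventional approach the paper seeks to avoid is early stopping driven by a held-out validation set. Below is a minimal sketch of that baseline, patience-based validation early stopping; it does not reproduce the paper's NNK polytope interpolation criterion. The toy model, synthetic data, and patience value are illustrative assumptions.

```python
# Minimal sketch (assumption): conventional patience-based early stopping,
# which needs a held-out validation set -- the dependency this paper aims
# to remove. Model, data, and hyperparameters are illustrative only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Synthetic stand-in data: 8x8 single-channel "images", 10 classes.
x = torch.randn(512, 1, 8, 8)
y = torch.randint(0, 10, (512,))
train_loader = DataLoader(TensorDataset(x[:384], y[:384]), batch_size=64, shuffle=True)
val_loader = DataLoader(TensorDataset(x[384:], y[384:]), batch_size=64)

# Tiny ConvNet stand-in for the channel-based architectures the paper discusses.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

best_val, patience, wait = float("inf"), 5, 0  # patience value is an assumption
for epoch in range(100):
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        criterion(model(xb), yb).backward()
        optimizer.step()

    # Validation pass: the held-out data that a validation-free,
    # channel-wise criterion would make unnecessary.
    model.eval()
    with torch.no_grad():
        val_loss = sum(criterion(model(xb), yb).item() for xb, yb in val_loader)

    if val_loss < best_val:
        best_val, wait = val_loss, 0
    else:
        wait += 1
        if wait >= patience:
            print(f"Early stopping at epoch {epoch}")
            break
```

The key design point of this baseline is that a slice of the training data must be set aside solely to decide when to stop, which is exactly the cost a validation-free stopping criterion avoids.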

Cited by 0 publications
References 20 publications