2021
DOI: 10.48550/arxiv.2105.02375
Preprint

A Geometric Analysis of Neural Collapse with Unconstrained Features

Zhihui Zhu, Tianyu Ding, Jinxin Zhou, et al.

Abstract: We provide the first global optimization landscape analysis of Neural Collapse, an intriguing empirical phenomenon that arises in the last-layer classifiers and features of neural networks during the terminal phase of training. As recently reported in [1], this phenomenon implies that (i) the class means and the last-layer classifiers all collapse to the vertices of a Simplex Equiangular Tight Frame (ETF) up to scaling, and (ii) cross-example within-class variability of last-layer activations collapses to zero…
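As an illustrative sketch (not code from the paper), the Simplex ETF that the abstract refers to can be constructed explicitly: for K classes it is, up to rotation and scaling, the set of columns of sqrt(K/(K-1)) * (I - (1/K) * 11^T), i.e. K unit vectors whose pairwise inner products all equal -1/(K-1). The function name below is my own.

```python
import numpy as np

def simplex_etf(K):
    """Return a (K, K) matrix whose columns form a Simplex ETF:
    K unit-norm vectors with equal pairwise inner product -1/(K-1)."""
    return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

M = simplex_etf(4)
G = M.T @ M                       # Gram matrix of the frame vectors
assert np.allclose(np.diag(G), 1.0)            # unit norms
off_diag = G[~np.eye(4, dtype=bool)]
assert np.allclose(off_diag, -1.0 / (4 - 1))   # equiangular: -1/(K-1)
```

The columns also sum to zero, reflecting that the vertices of a regular simplex are centered at the origin.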


Cited by 5 publications (18 citation statements)
References 63 publications
“…The most relevant study is a concurrent work [50], which provides a landscape analysis of the regularized unconstrained feature model. Zhu et al. [50] turn the feature norm constraint in Fang et al. [7] into feature norm regularization and still preserve the neural collapse global optimum. At the same time, they show that the modified regularized objective shares a benign landscape, where all the critical points are strict saddles except for the global one.…”
Section: Relationship With Other Results On Neural Collapse
confidence: 99%
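To make the quoted statement concrete, here is a minimal sketch of a regularized unconstrained-features objective of the kind discussed: cross-entropy on free last-layer features H and classifier W, with Frobenius-norm penalties on both in place of a feature-norm constraint. The function name and the regularization weights are illustrative assumptions, not values from either paper.

```python
import numpy as np

def regularized_ufm_loss(W, H, y, lam_W=5e-3, lam_H=5e-3):
    """Cross-entropy over free features H (columns) with classifier W,
    plus Frobenius-norm regularization on W and H (illustrative weights)."""
    Z = W @ H                                   # logits, shape (K, n)
    Z = Z - Z.max(axis=0, keepdims=True)        # numerical stability
    log_probs = Z - np.log(np.exp(Z).sum(axis=0, keepdims=True))
    ce = -log_probs[y, np.arange(H.shape[1])].mean()
    return ce + lam_W * np.sum(W**2) / 2 + lam_H * np.sum(H**2) / 2

rng = np.random.default_rng(0)
K, d, n = 4, 8, 12                              # classes, feature dim, samples
W = rng.standard_normal((K, d))
H = rng.standard_normal((d, n))                 # "unconstrained" features
y = rng.integers(0, K, size=n)
loss = regularized_ufm_loss(W, H, y)
```

Treating H as a free optimization variable (rather than the output of a network) is what makes the model "unconstrained", and it is under this objective that the cited landscape results are stated.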