2022
DOI: 10.1016/j.neunet.2022.03.019
Improving generalization of deep neural networks by leveraging margin distribution

Cited by 7 publications (8 citation statements)
References 24 publications
“…Lyu et al. [32] reformulate deep forest as an additive model that boosts new features by optimizing the margin distribution layer by layer. They first give a theoretical explanation of the success of the cascade structure from the perspective of margin theory.…”
Section: Deep Forest
confidence: 99%
“…Lots of recent work has improved the two components of deep forests [29, 31, 32, 37] and extended tree-based deep models to specific settings, such as multi-label learning [26], multi-instance learning [38], multi-modal learning [39], semi-supervised learning [27], and crowdsourcing aggregation [40]. It would be interesting to explore the possibility of exploiting deep forests for rehearsal [41].…”
Section: Deep Forest
confidence: 99%