2019
DOI: 10.3389/fams.2019.00046
Deep Net Tree Structure for Balance of Capacity and Approximation Ability

Abstract: Deep learning has been successfully used in various applications including image classification, natural language processing and game theory. The heart of deep learning is to adopt deep neural networks (deep nets for short) with certain structures to build up the estimator. Depth and structure of deep nets are two crucial factors in promoting the development of deep learning. In this paper, we propose a novel tree structure to equip deep nets to compensate the capacity drawback of deep fully connected neural n…
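The abstract contrasts tree-structured deep nets with fully connected ones. As an illustration only (this is our own minimal sketch with random weights, not the paper's construction), a binary-tree net combines inputs pairwise at each level, so each internal node has a fixed 2-to-1 affine map and the parameter count grows linearly in the input dimension rather than quadratically:

```python
# Hypothetical sketch of a tree-structured net on 8 inputs: each internal
# node applies a 2-to-1 affine map followed by ReLU. For 8 leaves there are
# 4 + 2 + 1 = 7 nodes, i.e. 7 * 3 = 21 parameters, versus 8*8 + 8 = 72 for
# a single fully connected 8-to-8 layer. Weights are random, for illustration.
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def tree_net(x):
    """Reduce the input pairwise, tree level by level, to a scalar output."""
    level = [np.array([v]) for v in x]  # 8 leaves
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            pair = np.concatenate([level[i], level[i + 1]])  # 2 child outputs
            W = rng.standard_normal((1, 2))                  # 2-to-1 weight
            b = rng.standard_normal(1)
            nxt.append(relu(W @ pair + b))
        level = nxt
    return level[0][0]

y = tree_net(rng.standard_normal(8))
```

Because the final node also passes through the ReLU, the output of this particular sketch is always nonnegative; the point is only the sparse, hierarchical connectivity pattern.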

Cited by 8 publications (4 citation statements)
References 34 publications
“…(x_i) − y_i)^2 over a compact subset H of C(X), which can be verified with the same proof as that of [4, Theorem 2]. Lemma 11.…”
mentioning
confidence: 53%
“…All the above estimates on approximation by deep neural networks, structured or fully connected, are stated in terms of the smoothness of the approximated function. Approximating radial functions by fully-connected neural networks was studied in [20,3,4], while representing functions with variables having given compositional structures by fully-connected networks designed based on the known compositional structures was considered in [22,27].…”
Section: Generalization Analysis of DCNNs
mentioning
confidence: 99%
“…where, for example, taking the special form of Toeplitz-type weight matrices leads to the deep convolutional nets [47], [48], [49], full matrices correspond to deep fully connected nets [12], and tree-type sparse matrices imply deep nets with tree structures [5], [6]. In this paper, we do not focus on the structure selection of deep nets, but rather on the existence of some deep net structure for realization of the sampling theorem established in Theorem 1.…”
Section: A. Deep ReLU Nets
mentioning
confidence: 99%
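The citation statement above describes how the structure of a layer's weight matrix determines the network type. A minimal sketch of the three cases (the sizes, filter values, and tree fan-in of 2 are our own illustrative choices, not taken from the cited papers):

```python
# Three weight-matrix structures for a layer of width d = 8:
#  - Toeplitz-type: every column is the same filter shifted by one row,
#    so the matrix realizes a convolution with only len(filt) free parameters.
#  - Full: a dense matrix with d*d free parameters (fully connected layer).
#  - Tree-type sparse: output node k only sees children 2k and 2k+1,
#    giving d/2 rows with 2 nonzeros each.
import numpy as np

d = 8  # layer width (illustrative)

# Toeplitz-type matrix: convolution with filter [1, -2, 0.5] written as
# a matrix multiplication.
filt = np.array([1.0, -2.0, 0.5])
toeplitz = np.zeros((d, d))
for i in range(d):
    for j, w in enumerate(filt):
        if i + j < d:
            toeplitz[i + j, i] = w

# Full matrix: every entry is a free parameter.
full = np.ones((d, d))

# Tree-type sparse matrix: one binary-tree reduction level.
tree = np.zeros((d // 2, d))
for k in range(d // 2):
    tree[k, 2 * k] = 1.0
    tree[k, 2 * k + 1] = 1.0

# Nonzero (i.e. trainable) entries per structure.
print(np.count_nonzero(toeplitz), np.count_nonzero(full), np.count_nonzero(tree))
```

Counting nonzeros makes the capacity contrast concrete: the Toeplitz and tree matrices have O(d) nonzero entries, while the full matrix has d² of them.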