2023
DOI: 10.1088/1402-4896/acf9ea

The kernel-balanced equation for deep neural networks

Kenichi Nakazato

Abstract: Deep neural networks have found many fruitful applications over the past decade. A network can acquire a generalized function through training on a finite dataset, and the degree of generalization reflects a proximity scale in the data space. In particular, this scale is not obvious when the dataset is complicated. Here we consider a network for estimating the distribution of the dataset. We show that the estimation is unstable and that the instability depends on the data density and the training duration. We derive the kernel…
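The abstract describes training a network to estimate the distribution of a finite dataset and reports that the estimate drifts with data density and training duration. As a rough illustration only, not the paper's kernel-balanced analysis, the sketch below trains a one-hidden-layer network in plain NumPy to regress a histogram-based density estimate of a toy 1-D sample; the dataset, layer size, learning rate, and step counts are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's method): fit a tiny
# one-hidden-layer network to a histogram density estimate and watch how
# the fit keeps changing with training duration.
import numpy as np

rng = np.random.default_rng(0)

# Finite toy dataset: a mixture of two Gaussians (assumption for illustration).
data = np.concatenate([rng.normal(-1.5, 0.3, 200), rng.normal(1.0, 0.5, 300)])

# Histogram-based target density on a grid, a crude stand-in for the
# distribution the network is asked to estimate.
grid = np.linspace(-3, 3, 120)
hist, edges = np.histogram(data, bins=40, range=(-3, 3), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
target = np.interp(grid, centers, hist)

# One-hidden-layer network: x -> tanh(W1 x + b1) -> w2 . h + b2
H = 32
W1 = rng.normal(0, 1.0, (H, 1))
b1 = np.zeros(H)
w2 = rng.normal(0, 0.1, H)
b2 = 0.0
lr = 0.05

X = grid[:, None]                      # (N, 1) grid inputs
for step in range(5001):
    h = np.tanh(X @ W1.T + b1)         # (N, H) hidden activations
    y = h @ w2 + b2                    # (N,) predicted density
    err = y - target                   # regression residual
    # Backpropagation of the mean-squared error through the small network.
    grad_y = 2 * err / len(grid)
    grad_w2 = h.T @ grad_y
    grad_b2 = grad_y.sum()
    grad_h = np.outer(grad_y, w2) * (1 - h ** 2)
    grad_W1 = grad_h.T @ X
    grad_b1 = grad_h.sum(axis=0)
    w2 -= lr * grad_w2
    b2 -= lr * grad_b2
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    if step % 1000 == 0:
        # The estimate keeps shifting as training continues, loosely echoing
        # the abstract's dependence on training duration.
        print(f"step {step:5d}  mse {np.mean(err ** 2):.5f}")
```

Rerunning the sketch with a sparser sample (fewer points in `data`) makes the fitted curve noticeably rougher, which is the informal sense in which the estimate depends on the data density.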

Cited by 2 publications (1 citation statement)
References 30 publications
“…Neural networks consist of artificial neurons connected with each other and work as an indirect memory, which learns the structural properties in the data space in an adaptive manner as a whole [6]. In other words, it is a typical complex adaptive system, which realizes an adaptive function with the distributed elements [7,8].…”
Section: Introduction (mentioning)
Confidence: 99%