2019
DOI: 10.48550/arxiv.1911.09257
Preprint

DeepLABNet: End-to-end Learning of Deep Radial Basis Networks with Fully Learnable Basis Functions

Abstract: From fully connected neural networks to convolutional neural networks, the learned parameters within a neural network have been primarily relegated to the linear parameters (e.g., convolutional filters). The non-linear functions (e.g., activation functions) have largely remained, with few exceptions in recent years, parameterless, static throughout training, and limited in design variation. Largely ignored by the deep learning community, radial basis function (RBF) networks provide an interesting mechanism …
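To make the abstract's premise concrete, below is a minimal PyTorch sketch of what a radial basis layer with fully learnable basis functions could look like: the centers and widths are trained end to end alongside the linear parameters. The class name, Gaussian basis choice, and parameterization are illustrative assumptions, not the paper's actual DeepLABNet formulation.

```python
import torch
import torch.nn as nn

class GaussianRBFLayer(nn.Module):
    """Radial basis layer whose basis functions are learnable:
    both the centers and the bandwidths receive gradients
    (hypothetical sketch, not the paper's method)."""
    def __init__(self, in_features: int, num_centers: int):
        super().__init__()
        # Learnable RBF centers, one per basis function.
        self.centers = nn.Parameter(torch.randn(num_centers, in_features))
        # Learnable log-widths keep bandwidths positive after exp().
        self.log_widths = nn.Parameter(torch.zeros(num_centers))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pairwise distances between inputs and centers: (batch, num_centers).
        dists = torch.cdist(x, self.centers)
        widths = torch.exp(self.log_widths)
        # Gaussian basis response; gradients flow to centers and widths.
        return torch.exp(-(dists / widths) ** 2)

# Usage: an RBF layer dropped into an otherwise standard MLP.
model = nn.Sequential(nn.Linear(16, 8), GaussianRBFLayer(8, 32), nn.Linear(32, 1))
out = model(torch.randn(4, 16))
```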


Cited by 4 publications (4 citation statements)
References 21 publications
Years of citing publications: 2020, 2020, 2024, 2024

Citation statements:
“…Figure 4 illustrates the layout of such a DNN with fully connected layers. State-of-the-art DNNs incorporate various other types of layers, including RBFN [178], SVR [179], and TSK fuzzy [180,181] layers. DNNs can be endowed with the ability to handle time series data by incorporating long short-term memory (LSTM) or gated recurrent unit (GRU) layers [176,182].…”
Section: Deep Neural Network (mentioning)
confidence: 99%
“…4 illustrates the layout of such a DNN with fully connected layers. State-of-the-art DNNs incorporate various other types of layers, including RBFN [175], SVR [176], and TSK fuzzy [177,178] layers. DNNs can be endowed with the ability to handle time series data by incorporating long short-term memory (LSTM) or gated recurrent unit (GRU) layers [173,179].…”
Section: Deep Neural Network (mentioning)
confidence: 99%
“…Activation functions with curvature are especially common in ANNs with only a couple of layers. For example, activation functions in radial basis function (RBF) networks [936][937][938][939] , which are efficient universal approximators, are often Gaussians, multiquadratics, inverse multiquadratics, or square-based RBFs 940 .…”
Section: Nonlinear Activation (mentioning)
confidence: 99%
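For reference, the basis function profiles named in the statement above are standard and can be written as functions of the radius r = ||x − c||. The sketch below states them in the same Python style as the earlier example; the shape parameter eps is an illustrative assumption, not taken from any of the cited papers.

```python
import torch

# Standard RBF profiles as functions of radius r = ||x - c||,
# with a shared shape parameter eps (illustrative sketch).
def gaussian(r: torch.Tensor, eps: float = 1.0) -> torch.Tensor:
    return torch.exp(-(eps * r) ** 2)

def multiquadric(r: torch.Tensor, eps: float = 1.0) -> torch.Tensor:
    return torch.sqrt(1 + (eps * r) ** 2)

def inverse_multiquadric(r: torch.Tensor, eps: float = 1.0) -> torch.Tensor:
    return 1 / torch.sqrt(1 + (eps * r) ** 2)
```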