2022
DOI: 10.1111/rssb.12496
A Kernel-Expanded Stochastic Neural Network

Abstract: The deep neural network suffers from many fundamental issues in machine learning. For example, it often gets trapped in a local minimum during training, and its prediction uncertainty is hard to assess. To address these issues, we propose the so-called kernel-expanded stochastic neural network (K-StoNet) model, which incorporates support vector regression as the first hidden layer and reformulates the neural network as a latent variable model. The former maps the input vector into an infinite dimensional fe…
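The two ingredients named in the abstract can be illustrated with a minimal numerical sketch. This is a hedged illustration only, assuming an RBF kernel and Gaussian latent hidden units; the paper's actual K-StoNet uses support vector regression in the first layer and an imputation-regularized-optimization style training algorithm, which is not reproduced here. All names (`rbf_features`, the chosen centers, the weight matrices) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, gamma=1.0):
    """Kernel expansion: map each input to its RBF similarities with a
    set of centers (playing the role of support vectors)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = rng.normal(size=(20, 3))     # 20 inputs in R^3
centers = X[:5]                  # stand-ins for support vectors
Phi = rbf_features(X, centers)   # kernel-expanded first "layer"

# Latent-variable view: the hidden layer is a linear map of the kernel
# features plus Gaussian noise, i.e. stochastic rather than deterministic
# hidden units, which is what makes uncertainty assessment tractable.
W = rng.normal(size=(5, 4))
Z = Phi @ W + 0.1 * rng.normal(size=(20, 4))   # latent hidden variables
y_hat = np.tanh(Z) @ rng.normal(size=(4,))     # scalar output per input
```

Because `Z` is random given `X`, repeated forward passes yield a distribution over `y_hat` rather than a point prediction, which is the mechanism the abstract appeals to for assessing prediction uncertainty.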


Cited by 3 publications (2 citation statements)
References 40 publications
“…The potential of stochastic activation layers for UQ was briefly discussed in [69], but the context of this work was classification tasks, where the predictive uncertainty could already be fully characterized by well-calibrated categorical distributions without using any latent noise. More recently, [70] formulated DNNs as latent variable models and included kernel maps in the input layer to avoid feature collinearity. Although the proposed model was capable of UQ, the more challenging problem of DR was not studied.…”
Section: Related Work
confidence: 99%
“…The potential of stochastic activation layers for UQ was briefly discussed in Lee et al (2019) , but the context of this work was classification tasks, where the predictive uncertainty could already be fully characterized by well-calibrated categorical distributions without using any latent noise. More recently, Sun and Liang (2022) formulated DNNs as latent variable models and included kernel maps in the input layer to avoid feature collinearity. Although the proposed model is capable of UQ, the more challenging problem of DR has not been studied.…”
Section: Introduction
confidence: 99%