2017 IEEE Energy Conversion Congress and Exposition (ECCE)
DOI: 10.1109/ecce.2017.8096563

Training neural-network-based controller on distributed machine learning platform for power electronics systems

Cited by 17 publications (16 citation statements)
References 8 publications
“…The diagrammatic representation implies the conditional dependence relationship between the decision variables. The underlying relationship in the model is formulated in the Bayesian framework [1] and can be inferred in Design [79], [80], Control [4], [51], [81]–[85], Maintenance [86]–[97]; Radial Control [121], [122], Maintenance [7], [74], [96], [118], [123]–[126]; Relevance vector machine (RVM)…”
Section: Machine Learning
confidence: 99%
“…A generic method is to start with a relatively small number of neurons and then gradually increase it according to the training error. For the activation function in the hidden layer, there are various options, including sigmoid [4], [51], [52], [83], radial basis function [50], [150], hyperbolic tangent function [106], [151], wavelet [46], [53], [84], [152], etc. It is worth mentioning that the wavelet activation function possesses the superior capabilities of convergence speed and generalization.…”
Section: B Neural Network-based Controller
confidence: 99%
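The activation-function options named in the excerpt above (sigmoid, radial basis function, hyperbolic tangent, wavelet) can be sketched as plain functions. This is an illustrative sketch only: the Gaussian RBF width and the Morlet-style wavelet constant (1.75) are common textbook choices, not values taken from the cited papers.

```python
import math

# Illustrative hidden-layer activation functions, one per family
# mentioned in the excerpt.  Parameter values are assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return math.tanh(x)

def gaussian_rbf(x, center=0.0, width=1.0):
    # Gaussian radial basis function centred at `center`
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))

def morlet_wavelet(x):
    # Cosine-modulated Gaussian, a popular wavelet activation
    return math.cos(1.75 * x) * math.exp(-0.5 * x * x)

# Minimal forward pass through one hidden neuron of each type,
# with unit weights, just to show the layer structure involved.
x = 0.5
hidden = [sigmoid(x), tanh(x), gaussian_rbf(x), morlet_wavelet(x)]
output = sum(hidden)  # single output neuron, all weights = 1
```

The excerpt's "start small and grow" heuristic would wrap a network built from such units in a loop that adds hidden neurons until the training error falls below a tolerance.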
“…Of course, it can be completed with conventional optimization methods, e.g. PSO [51], recursive least square [98], Kalman filter [104], etc. Considering a large number of parameters in the neural network, these conventional optimization methods are generally inefficient.…”
Section: Input Layer Hidden Layer Output Layer
confidence: 99%
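Recursive least squares, one of the conventional optimizers named in the excerpt, is most tractable when only the linear output-layer weights are estimated while the hidden-layer features stay fixed. A minimal sketch under that assumption, fitting a hypothetical two-parameter model y = 2·x1 + 3·x2 (the model, sample values, and function name are illustrative, not from the cited papers):

```python
# Recursive least squares (RLS) for linear output weights.
# Assumes fixed feature vectors phi; everything here is a sketch.

def rls_fit(samples, dim, p0=1e6, lam=1.0):
    """samples: list of (phi, y); p0: initial covariance scale;
    lam: forgetting factor (1.0 = ordinary least squares)."""
    w = [0.0] * dim
    # Covariance matrix P starts as p0 * identity
    P = [[p0 if i == j else 0.0 for j in range(dim)] for i in range(dim)]
    for phi, y in samples:
        # Gain vector k = P*phi / (lam + phi^T * P * phi)
        Pphi = [sum(P[i][j] * phi[j] for j in range(dim)) for i in range(dim)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(dim))
        k = [v / denom for v in Pphi]
        # Weight update driven by the prediction error
        err = y - sum(w[i] * phi[i] for i in range(dim))
        w = [w[i] + k[i] * err for i in range(dim)]
        # Covariance update: P = (P - k * (P*phi)^T) / lam
        P = [[(P[i][j] - k[i] * Pphi[j]) / lam for j in range(dim)]
             for i in range(dim)]
    return w

samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], 3.0),
           ([1.0, 1.0], 5.0), ([2.0, 1.0], 7.0)]
w = rls_fit(samples, dim=2)  # converges toward [2.0, 3.0]
```

The O(dim²) covariance update per sample is what makes RLS attractive for a handful of output weights and, as the excerpt notes, inefficient when applied to the full parameter set of a large network.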