2020 Advanced Computing and Communication Technologies for High Performance Applications (ACCTHPA)
DOI: 10.1109/accthpa49271.2020.9213201

Comparison of deep neural networks for reference evapotranspiration prediction using minimal meteorological data

Cited by 12 publications (8 citation statements); references 36 publications.
“…The DNN learning mechanism involves the iterative execution of feed-forward and error back-propagation cycles until the optimal degree of precision has been attained (Saggi and Jain 2019; Sowmya et al 2020). DNN's generalization efficiency relies on the activation function used.…”
Section: Deep Neural Network (mentioning)
Confidence: 99%
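The excerpt above describes the training process in prose. The following minimal sketch illustrates such an iterative feed-forward / error back-propagation cycle for ETo regression; it is an assumption for illustration only, not the cited paper's implementation, and the feature count, layer sizes, optimizer, and hyperparameters are placeholders.

```python
# Illustrative sketch (assumed, not the paper's code): a small fully
# connected DNN trained by repeated feed-forward and back-propagation
# passes, with MSE as the regression loss.
import torch
import torch.nn as nn

n_features = 4          # assumed minimal meteorological inputs (e.g., Tmax, Tmin, RH, wind)
model = nn.Sequential(
    nn.Linear(n_features, 32),
    nn.ReLU(),          # activation function influences generalization efficiency
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),   # single output: reference evapotranspiration (ETo)
)

loss_fn = nn.MSELoss()                                   # mean squared error for regression
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy tensors standing in for real meteorological records.
X = torch.randn(256, n_features)
y = torch.randn(256, 1)

for epoch in range(200):        # iterate until the desired precision is reached
    y_pred = model(X)           # feed-forward pass
    loss = loss_fn(y_pred, y)   # compute the loss to be minimized
    optimizer.zero_grad()
    loss.backward()             # error back-propagation (gradient computation)
    optimizer.step()            # update weights and biases
```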
“…The back-propagation error mechanism reinforces the model through a derivative-based method of error calculation and the updating of weights and biases. The weights of each neuron are adjusted with a loss function, and the loss function has to be minimized during the training process to increase model performance (Sowmya et al 2020; Vieira et al 2020). For regression, the widely used loss function is the mean squared error (MSE), and it is computed using Eq.…”
Section: Deep Neural Network (mentioning)
Confidence: 99%
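The equation referenced at the end of the excerpt is not reproduced in this report; the standard form of the mean squared error used for regression is

\[
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2
\]

where \(y_i\) is the observed value, \(\hat{y}_i\) the model prediction, and \(n\) the number of training samples.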