2020
DOI: 10.1103/PhysRevFluids.5.054606
Modeling subgrid-scale forces by spatial artificial neural networks in large eddy simulation of turbulence

Cited by 106 publications (107 citation statements)
References 93 publications
“…2019; Xie et al. 2019a,b,c, 2020a,b) hidden layers, and Gamahara & Hattori (2017) showed that 100 neurons per hidden layer were sufficient for accurate predictions for a turbulent channel flow in an a priori test. We also tested an NN with three hidden layers, but more than two hidden layers did not further improve the performance in either a priori or a posteriori tests (see the Appendix).…”
Section: Numerical Details
confidence: 99%
“…(2019) reported that using the filter size as well as the velocity gradient tensor as input variables was beneficial for predicting the SGS stresses in flows with a filter size different from that of the training data. Xie, Wang & E (2020a) used an FCNN to predict the SGS force from inputs at multiple grid points, and this FCNN outperformed the DSM in predicting the energy spectrum. In the case of three-dimensional decaying isotropic turbulence, Wang et al.…”
Section: Introduction
confidence: 99%
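The architectures described in these statements are plain fully connected networks. As a rough illustration only, the following NumPy sketch wires up an untrained MLP with two hidden layers of 100 neurons (the depth and width the snippets report as sufficient) mapping a hypothetical flattened 3×3×3 stencil of filtered velocity components to the three SGS-force components; all sizes, names, and the ReLU activation are assumptions for illustration, not details taken from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 3x3x3 stencil * 3 velocity components = 81 inputs,
# two hidden layers of 100 neurons, 3 SGS-force components out.
sizes = [81, 100, 100, 3]
weights = [rng.normal(0.0, np.sqrt(2.0 / m), (m, n))
           for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def mlp(x):
    """Forward pass: ReLU hidden layers, linear output layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(x @ W + b, 0.0)
    return x @ weights[-1] + biases[-1]

# One filtered-velocity stencil sample -> predicted SGS force (untrained weights).
stencil = rng.normal(size=81)
sgs_force = mlp(stencil)
print(sgs_force.shape)  # (3,)
```

In an a priori setting such a network would be fitted to filtered DNS data; the sketch only shows the input/output shapes implied by the snippets.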
“…The direct prediction of the closure terms, instead of their modeling, offers an alternative to the parameter-estimation task in the previous section. Here, the unknown terms in Equation () are directly approximated by the ML algorithm, either as fluxes or as the forces themselves [5,21,79‐81,84,86,90]. It is important to stress, however, that the mapping from the coarse to the fine field is nonunique, and thus each coarse field is associated with a distribution of corresponding closure terms.…”
Section: Examples of ML-augmented Turbulence Modeling
confidence: 99%
“…Said inconsistency is caused by the interaction of the model predictions and numerical errors, which shifts the data statistics and is amplified by error accumulation and self-driven error growth at high wavenumbers. This can be tackled either by removing this energy through a dissipative mechanism [86] (essentially an additional model term) or by projecting the closure term onto a stable basis with the desired properties [5].…”
Section: Examples of ML-augmented Turbulence Modeling
confidence: 99%
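One simple realization of such a dissipative safeguard, not necessarily the mechanism of [86], is to clip backscatter: zero the predicted SGS stress wherever it would inject energy into the resolved field, so the model dissipation is locally non-negative. The sketch below assumes symmetric tensors stored as 6 independent components per grid point and uses toy random fields; the variable names and the clipping rule itself are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fields: predicted SGS stress tau_ij and resolved strain rate S_ij
# at N grid points, each as 6 independent symmetric-tensor components
# (xx, yy, zz, xy, xz, yz).
N = 1000
tau = rng.normal(size=(N, 6))
S = rng.normal(size=(N, 6))

# Local SGS dissipation eps = -tau_ij S_ij (positive = forward energy transfer).
# Off-diagonal components are counted twice in the symmetric contraction.
w = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0])
eps = -np.einsum('ij,j,ij->i', tau, w, S)

# Clip backscatter: zero the model stress wherever it would inject energy.
tau_stable = np.where(eps[:, None] >= 0.0, tau, 0.0)

eps_stable = -np.einsum('ij,j,ij->i', tau_stable, w, S)
print(np.all(eps_stable >= 0.0))  # True
```

Clipping guarantees stability at the cost of discarding backscatter; the basis-projection approach of [5] mentioned above is an alternative that preserves more of the predicted physics.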