2018
DOI: 10.1016/j.advwatres.2018.01.009

Water and sediment temperature dynamics in shallow tidal environments: The role of the heat flux at the sediment-water interface

Cited by 27 publications (34 citation statements); references 37 publications.
Citing publications span 2019–2024.
“…[34,35] first derive an analytical expression of the energy gradient as infinite tensor networks, and then contract these networks approximately to obtain an approximate gradient. Thus, the two approaches respectively differentiate the approximation and approximate the derivative [44]. Other than the general recommendation of Ref.…”
Section: Discussion
confidence: 99%
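The distinction quoted above can be made concrete with a toy sketch (not taken from the cited works; the power-iteration "energy", the matrix, and all names below are illustrative assumptions): "differentiate the approximation" backpropagates through an approximate algorithm, while "approximate the derivative" evaluates an analytical gradient formula at an approximately converged solution.

```python
# Toy sketch: dominant eigenvalue of a symmetric matrix stands in for the
# "energy", and unrolled power iteration stands in for the approximate contraction.
import jax
import jax.numpy as jnp

def power_iteration(A, n_iter=100):
    """Approximate dominant eigenpair of a symmetric matrix by unrolled power iteration."""
    v = jnp.ones(A.shape[0]) / jnp.sqrt(A.shape[0])
    for _ in range(n_iter):
        w = A @ v
        v = w / jnp.linalg.norm(w)
    return v @ (A @ v), v  # (Rayleigh quotient, approximate eigenvector)

key = jax.random.PRNGKey(0)
M = jax.random.normal(key, (8, 8))
A = 0.5 * (M + M.T)  # random symmetric test matrix

# "Differentiate the approximation": backpropagate through the unrolled iteration.
grad_of_approx = jax.grad(lambda A: power_iteration(A)[0])(A)

# "Approximate the derivative": the analytic gradient dλ/dA = v vᵀ
# (Hellmann-Feynman for a symmetric matrix), evaluated at the approximate eigenvector.
_, v = power_iteration(A)
approx_of_grad = jnp.outer(v, v)

# The two agree up to the convergence error of the power iteration.
print(jnp.max(jnp.abs(grad_of_approx - approx_of_grad)))
```

At convergence the two results coincide; the difference lies in which object, the algorithm or the formula, gets approximated.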
“…Automatic differentiation comes with a general theoretical guarantee on its performance: the cost of the gradient computation does not exceed the algorithmic complexity of the original program [41,42]. Automatic differentiation is the computational engine of modern deep learning applications [43,44]. Moreover, automatic differentiation also finds applications in quantum optimal control [45] and quantum chemistry calculations such as computing forces [46] and optimizing basis parameters [47].…”
Section: A. Automatic Differentiation
confidence: 99%
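A hedged illustration of that guarantee (a sketch of generic reverse-mode usage, not code from the cited references; the function and array sizes are made up): one reverse pass returns the entire gradient at a cost comparable to the forward evaluation.

```python
# Illustrative only: an arbitrary scalar program; reverse mode returns its full
# gradient in one backward pass, at a cost comparable to the forward evaluation.
import jax
import jax.numpy as jnp

def loss(x):
    # A scalar program built from elementary operations.
    return jnp.sum(jnp.sin(x) ** 2) + jnp.log1p(x @ x)

x = jnp.linspace(0.1, 1.0, 1000)

value = loss(x)            # one forward evaluation
grad = jax.grad(loss)(x)   # all 1000 partial derivatives from one reverse pass

# A finite-difference estimate of the same gradient would need roughly 1000
# extra forward evaluations, which is what the "cheap gradient" guarantee avoids.
print(value, grad.shape)
```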
“…(3) we have exploited the reversibility of the coordinate transformation. Interestingly, the momentum transformation has the form of a vector-Jacobian product, which is commonly implemented for reverse-mode automatic differentiation [45]. This is intuitively understandable since the momentum is covariant under the coordinate transformation.…”
Section: Neural Point Transformations
confidence: 99%
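A minimal sketch of that observation (the map `f` and the vectors below are illustrative assumptions, not the cited model): for a point transformation Q = f(q), the conjugate momenta are related by p = Jᵀ P with J = ∂Q/∂q, which is exactly the vector-Jacobian product that reverse-mode tools expose, e.g. via jax.vjp.

```python
# Sketch assuming a toy invertible map `f` in place of the neural point transformation.
import jax
import jax.numpy as jnp

def f(q):
    # Toy invertible coordinate transformation (placeholder for a flow layer).
    return q + 0.1 * jnp.tanh(q)

q = jnp.array([0.3, -1.2, 0.7])   # old coordinates
P = jnp.array([1.0, 0.5, -0.2])   # momentum conjugate to the new coordinates Q

Q, vjp_fn = jax.vjp(f, q)         # forward pass plus reverse-mode closure
(p,) = vjp_fn(P)                  # p = Jᵀ P, the momentum in the old coordinates

# Cross-check against the explicit Jacobian (affordable only in this toy dimension).
J = jax.jacfwd(f)(q)
assert jnp.allclose(p, J.T @ P)
```

Evaluating p this way never materializes the Jacobian, which is the same reason reverse-mode differentiation scales well in high dimension.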