2021
DOI: 10.1016/j.ece.2021.01.011
Introducing students to research codes: A short course on solving partial differential equations in Python

Cited by 15 publications (3 citation statements); references 29 publications.
“…or rapidly testing and optimising different designs using powerful computational resources (e.g., ASPEN (Ruiz-Ramos et al, 2017), CAD, FLUENT or COMSOL modelling, etc. (Inguva et al, 2020)).…”
Section: Nature of Experiments in Teaching Laboratories
“…Large language models (LLMs) based on the general framework of decoder-only autoregressive transformer models provide powerful tools for scientific exploration. The specific class of Generative Pretrained Transformer (GPT) models has received significant attention across many fields of inquiry, suggesting remarkable possibilities, especially for generative forward and inverse tasks. However, our understanding of their behavior is still in its infancy, and issues remain with respect to fact recall and potential hallucination, which require careful validation of their predictions and a thorough exploration of implications. Several recent developments propose the use of LLM-type models, and more generally attention-based transformer architectures, to capture the behavior of physical systems, including materials. Other studies have suggested the broad applicability of transformer models in use cases ranging from protein folding, protein property prediction, and mechanical field predictions to optimizers, materials design, and educational tools, among many others. The availability of large models is facilitated by recent developments such as the GPT-4 model, as well as by releases of open-source LLMs such as Llama-2 or Falcon. Here, we discuss strategies for how LLMs can be improved to provide more accurate responses, especially in the context of materials analysis and design, and what mechanisms we can use to elicit more nuanced, detailed, and relevant outcomes, including interpretability.…”
Section: Introduction
“…Open-source software is readily available to users, whereas commercial software can be highly expensive. Moreover, open-source tools have demonstrated their scalability through their frequent use in industry and academia (Inguva et al 2021). Python is the most popular programming language in machine learning applications.…”
Section: Introduction