Active Matter 2017
DOI: 10.7551/mitpress/11236.003.0009
Interview Between Markus J. Buehler and Tomás Saraceno

Cited by 2 publications (2 citation statements); references 0 publications.
“…LLMs are particularly intriguing due to their ability to reason and to apply knowledge recall, synthesis, and translation across multitudes of domains from logic to mathematics to simulation data. [31, 65–69]…”

Section: Discussion
Confidence: 99%
“…Recently, transformer architectures, which exhibit outstanding performance in natural language processing (NLP), have gained attention for their performance. Transformers can discern importance through a self-attention mechanism and achieve superior performance by gaining a deeper understanding of the context. Furthermore, these models have facilitated agent-based modeling, aiding in the automation of problem-solving in the field of materials. These architectures require a pretraining process during which they learn general characteristics from extensive data sets. When fine-tuned for a specific task, they are known to perform better, even with limited data.…”

Section: Introduction
Confidence: 99%