2023
DOI: 10.1016/j.jmat.2023.05.001
Generative artificial intelligence and its applications in materials science: Current situation and future perspectives

Cited by 91 publications (31 citation statements)
References 35 publications
“…However, the performance of ML models is significantly contingent upon the quality and diversity of input data. [47] Currently, the utilization of highly reliable experimental data contributes to the construction of more robust models. For classification tasks, the acquisition of labeled data is essential, whereas regression tasks necessitate the gathering of data pertaining to input features and corresponding continuous numerical values (target variables).…”
Section: Introduction
Confidence: 99%
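To make the labeled-versus-continuous distinction in the excerpt concrete, here is a minimal sketch of how the two task types differ in their data requirements. The feature matrix, the discrete "phase" labels, and the continuous "band gap" target are all hypothetical placeholders, and scikit-learn is assumed only as a convenient illustration, not as anything used in the cited work.

```python
# Minimal sketch (hypothetical data): labeled data for classification vs.
# continuous numerical targets for regression, using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)

# Input features: e.g. hypothetical composition/processing descriptors.
X = rng.random((200, 4))

# Classification: each sample carries a discrete label (here: phase 0/1/2).
y_class = rng.integers(0, 3, size=200)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y_class)

# Regression: each sample carries a continuous target (here: a made-up
# "band gap" in eV derived from the features plus noise).
y_reg = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.1 * rng.standard_normal(200)
reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y_reg)

print(clf.predict(X[:3]))   # discrete class labels
print(reg.predict(X[:3]))   # continuous numerical values
```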
“…The application of ML in materials science has demonstrated remarkable potential, holding promise of expediting the discovery, optimization, and design processes of novel materials. However, the performance of ML models is significantly contingent upon the quality and diversity of input data. Currently, the utilization of highly reliable experimental data contributes to the construction of more robust models.…”
Section: Introduction
Confidence: 99%
“…Within the realm of AI, third, the GAI methods (e.g., generative adversarial networks (GANs), variational autoencoders) are other emerging tools with outstanding ability to learn complex patterns in the data and generate new contents, properties, or structures. [13] The GAI algorithms are particularly strong for inverse design of material systems. [13] Despite the significant capacity of the GAI methods in enhancing TENG systems in various aspects, their substantive application in this area is conspicuous by its absence.…”
Section: Introduction
Confidence: 99%
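The excerpt above names GANs and variational autoencoders as generative tools that learn patterns in data and propose new candidates. Below is a minimal VAE sketch in PyTorch that treats a fixed-length material descriptor vector as the object to be generated; the descriptor dimension, network sizes, training loop, and placeholder data are all assumptions made for illustration and do not reproduce any method from the cited works.

```python
# Minimal VAE sketch (PyTorch assumed). Encodes fixed-length material
# descriptor vectors into a latent space and decodes prior samples back,
# which is the basic mechanism behind VAE-based inverse design.
import torch
import torch.nn as nn

DESC_DIM, LATENT_DIM = 32, 4  # hypothetical descriptor / latent sizes

class DescriptorVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(DESC_DIM, 64), nn.ReLU())
        self.to_mu = nn.Linear(64, LATENT_DIM)
        self.to_logvar = nn.Linear(64, LATENT_DIM)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, DESC_DIM))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon_err = ((recon - x) ** 2).sum(dim=1).mean()
    kl = (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)).mean()
    return recon_err + kl

model = DescriptorVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_train = torch.rand(512, DESC_DIM)  # placeholder descriptors, not real data

for _ in range(200):
    recon, mu, logvar = model(x_train)
    loss = vae_loss(recon, x_train, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Generation": decode samples from the prior into new descriptor vectors.
with torch.no_grad():
    new_candidates = model.decoder(torch.randn(8, LATENT_DIM))
```

In an inverse-design workflow, candidates decoded from the latent space would then be screened with a property predictor or simulation; that screening step is omitted here.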
“…[13] The GAI algorithms are particularly strong for inverse design of material systems. [13] Despite the significant capacity of the GAI methods in enhancing TENG systems in various aspects, their substantive application in this area is conspicuous by its absence. The major challenge with the implementation of AI and GAI methods is that they often require a large amount of data for calibration.…”
Section: Introduction
Confidence: 99%
“…Potential energy surface approaches based on empirical interatomic potentials are fast, however, often inaccurate. Recently, machine learning (ML) techniques with density functional theory (DFT) results as inputs yield fast simulations and accurate results. One popular choice is the descriptor-based neural network (NN) approaches, by regarding the energy as a function of bond lengths, bond angles, or related symmetry functions. However, there are three main issues: (1) large amount of symmetry functions, predetermined subjectively; (2) time-consuming and poor accuracy in force fitting, based on gradient of energy; (3) no objective distribution criteria in the training data generation.…”
Confidence: 99%
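To illustrate the descriptor-based NN idea mentioned in the excerpt, the sketch below builds a few Gaussian radial symmetry functions from interatomic distances, feeds them to a small per-atom MLP, sums the atomic contributions into a total energy, and obtains forces as the negative gradient of that energy. The symmetry-function widths, cutoff, layer sizes, and random configuration are hypothetical placeholders rather than the specific scheme of the cited works.

```python
# Sketch of a descriptor-based NN potential: Gaussian radial symmetry
# functions -> per-atom MLP energies -> total energy; forces via autograd.
# All parameters (eta values, cutoff, layer sizes) are illustrative only.
import math
import torch
import torch.nn as nn

ETAS = torch.tensor([0.5, 1.0, 2.0, 4.0])   # Gaussian widths (hypothetical)
R_CUT = 6.0                                  # radial cutoff, arbitrary units

def radial_symmetry_functions(positions):
    """positions: (N, 3) tensor -> (N, len(ETAS)) descriptor vector per atom."""
    n = positions.shape[0]
    diff = positions[:, None, :] - positions[None, :, :]          # (N, N, 3)
    d2 = (diff ** 2).sum(dim=-1)
    # Push self-pairs far outside the cutoff so they contribute nothing
    # (and the sqrt gradient stays finite on the diagonal).
    d2 = d2 + torch.eye(n) * (10.0 * R_CUT) ** 2
    dist = torch.sqrt(d2)                                         # (N, N)
    fc = torch.where(dist < R_CUT,
                     0.5 * (torch.cos(math.pi * dist / R_CUT) + 1.0),
                     torch.zeros_like(dist))                      # smooth cutoff
    # G_i(eta) = sum_j exp(-eta * r_ij^2) * fc(r_ij)
    g = torch.exp(-ETAS[None, None, :] * dist[..., None] ** 2) * fc[..., None]
    return g.sum(dim=1)

class AtomicEnergyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(len(ETAS), 16), nn.Tanh(),
                                 nn.Linear(16, 1))

    def forward(self, positions):
        descriptors = radial_symmetry_functions(positions)
        return self.mlp(descriptors).sum()   # total energy = sum of atomic terms

model = AtomicEnergyNet()
pos = (torch.rand(10, 3) * 5.0).requires_grad_()   # placeholder configuration
energy = model(pos)
forces = -torch.autograd.grad(energy, pos)[0]      # F = -dE/dR
print(float(energy), forces.shape)
```

Training such a model would minimize errors in both energies and forces against DFT reference data; that fitting loop is omitted here for brevity.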