2004
DOI: 10.1088/0957-0233/16/1/041

Elements of informatics for combinatorial solid-state materials science

Abstract: The main purpose of using combinatorial techniques for materials science studies is to achieve higher experimental throughput than what is possible when samples are synthesized and characterized one at a time. The instrumentation needed for performing high-throughput synthesis and characterization has seen rapid development in recent years. The software tools needed to connect all parts of the materials development process are still largely lacking. In this paper we discuss the requirements of a combinatorial …

Cited by 14 publications (15 citation statements)
References 14 publications
“…108 The creation and deployment of HTE workflows necessarily leads to a bottleneck centered around the need to interpret large (sometimes thousands) of materials data correlated in composition, processing, and microstructure from a single experiment. 109,110 By the early 2000s a single HTE sample containing hundreds of individual samples could be made and measured for a range of characteristics within a week, but the subsequent knowledge extraction of composition, structure, properties of interest, and figure of merit (FOM) often took weeks to months. There were several early international efforts to standardize data formats and create data analysis and interpretation tools for large scale data sets.…”
Section: Combinatorial Libraries and High Throughput Experimentation
confidence: 99%
“…The issues include managing work flows in experiments, tracking multivariate measurements and storing the data, and the ability to query and retrieve information from such databases. A vast array of literature on this subject (72,73) …”
Section: Informatics and Combinatorial Experimentation
confidence: 99%
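The data-management tasks named in the statement above (managing experiment workflows, tracking multivariate measurements, storing the data, and querying it back) can be made concrete with a small sketch. The following Python example uses the standard-library sqlite3 module; the table layout and field names are assumptions chosen for illustration, not a schema prescribed by the paper or its citing works.

```python
# Minimal sketch of a queryable store for a combinatorial library:
# samples at positions on a composition spread, each with several
# measured properties. Schema and names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect("combi_library.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS sample (
    sample_id   INTEGER PRIMARY KEY,
    library_id  TEXT NOT NULL,      -- which composition-spread wafer
    x_mm        REAL, y_mm REAL,    -- position on the library
    composition TEXT                -- e.g. 'Fe0.2Co0.3Ni0.5'
);
CREATE TABLE IF NOT EXISTS measurement (
    sample_id   INTEGER REFERENCES sample(sample_id),
    property    TEXT,               -- e.g. 'resistivity'
    value       REAL,
    unit        TEXT
);
""")

# Store one sample with two measured properties.
cur = conn.execute(
    "INSERT INTO sample (library_id, x_mm, y_mm, composition) VALUES (?, ?, ?, ?)",
    ("LIB-001", 2.5, 7.0, "Fe0.2Co0.3Ni0.5"),
)
sid = cur.lastrowid
conn.executemany(
    "INSERT INTO measurement VALUES (?, ?, ?, ?)",
    [(sid, "resistivity", 1.8e-7, "ohm.m"), (sid, "bandgap", 1.1, "eV")],
)
conn.commit()

# Query all measured properties for samples on one library.
for row in conn.execute(
    "SELECT s.composition, m.property, m.value, m.unit "
    "FROM sample s JOIN measurement m USING (sample_id) "
    "WHERE s.library_id = ?", ("LIB-001",)
):
    print(row)
```

A relational layout like this is only one possible design; the point is that every measurement stays linked to its sample position and composition so figures of merit can later be retrieved with a single query.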
“…4 These problems are being solved on two levels: developing standards for representing materials data, and building database tools for rapidly processing, storing, and distributing large data sets. 5 There are numerous projects underway to create standards for describing materials and characterization data using XML, 6 but none of them has gained the widespread acceptance necessary for successful general distribution of XML-encoded materials characterization data sets. This is clearly a field that urgently needs further development.…”
Section: Data Management and Data Mining
confidence: 99%
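To give a sense of what an XML description of materials and characterization data might look like, the sketch below builds a small document for one sample on a composition spread using Python's xml.etree.ElementTree. The element and attribute names are invented for this example; none of the standardization projects the quote refers to prescribes this exact schema.

```python
# Illustrative XML description of one combinatorial sample and one
# characterization measurement. Tag and attribute names are assumptions.
import xml.etree.ElementTree as ET

sample = ET.Element("sample", id="LIB-001-042")

comp = ET.SubElement(sample, "composition")
for element, fraction in [("Fe", 0.2), ("Co", 0.3), ("Ni", 0.5)]:
    ET.SubElement(comp, "constituent", symbol=element, atomicFraction=str(fraction))

charac = ET.SubElement(sample, "characterization")
xrd = ET.SubElement(charac, "measurement", technique="XRD")
ET.SubElement(xrd, "parameter", name="wavelength", unit="angstrom").text = "1.5406"
ET.SubElement(xrd, "result", name="dominantPhase").text = "fcc"

print(ET.tostring(sample, encoding="unicode"))
```

The value of such a format lies less in the particular tag names than in the agreement to use one schema across laboratories, which is exactly the acceptance problem the quoted statement points out.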