2020
DOI: 10.1007/s00216-020-02594-9
Data-dependent normalization strategies for untargeted metabolomics—a case study

Abstract: Despite the recent advances in the standardization of untargeted metabolomics workflows, there is still a lack of attention to specific data treatment strategies that require deep knowledge of the biological problem and need to be applied after a well-thought out process to understand the effect of the practice. One of those strategies is data normalization. Data-driven assumptions are critical especially addressing unwanted variation present in the biological model as it can be the case in heterogeneous tissu…

Cited by 29 publications (17 citation statements)
References 43 publications
“…Subsequently, in order to reduce the unwanted systematic bias, so that only biologically relevant variations are present in the data and that all samples become comparable in terms of absolute intensities, a normalization step is usually performed [ 103 ]. The selection of an appropriate normalization method depends on the type of sample to be analyzed and, over time, several methods have been reported; namely, the use of one or multiple internal standards, total metabolite signal, total protein content, DNA concentration, osmolality, urine creatinine, and probabilistic quotient normalization (PQN), among other algorithms [ 103 , 104 , 105 , 106 ]. The main strengths and limitations of each normalization method have been described elsewhere [ 103 , 107 ].…”
Section: Metabolomics Workflow
confidence: 99%
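Of the algorithms listed in this statement, probabilistic quotient normalization (PQN) is the one most easily misunderstood, so a minimal sketch may help. This is an illustrative implementation, not the code used by any of the cited papers; the function name, the choice of the median spectrum as reference, and the example matrix are all assumptions.

```python
# Minimal sketch of probabilistic quotient normalization (PQN).
# Assumption: X is a samples-by-features intensity matrix with no zeros
# in the reference features; the median spectrum serves as reference.
import numpy as np

def pqn_normalize(X, reference=None):
    """Divide each sample by the median of its feature-wise quotients
    against a reference spectrum (the median spectrum by default)."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)      # median spectrum as reference
    quotients = X / reference                 # feature-wise quotients per sample
    dilution = np.median(quotients, axis=1)   # per-sample dilution estimate
    return X / dilution[:, None]              # remove the dilution effect

# Example: the second sample is a 2x "dilution" of the first;
# after PQN both rows collapse onto the same spectrum.
X = np.array([[10.0, 20.0, 30.0],
              [20.0, 40.0, 60.0]])
X_norm = pqn_normalize(X)
```

The design point PQN makes, compared with total-signal normalization, is that a single per-sample dilution factor is estimated robustly (via the median quotient) rather than letting a few very intense metabolites dominate the scaling.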
“…The obtained data matrix was then manually inspected in Microsoft Excel (Microsoft Office 2010). Statistical analysis was conducted by retaining only those LC-MS features shared by more than 50% of samples (Cuevas-Delgado et al 2020). The peak intensity of each individual LC-MS feature in every sample was normalized to the peak intensity of the total useful signal (Cuevas-Delgado et al 2020).…”
Section: LC-MS Analysis and Data Processing
confidence: 99%
“…Statistical analysis was conducted by retaining only those LC-MS features shared by more than 50% of samples (Cuevas-Delgado et al 2020). The peak intensity of each individual LC-MS feature in every sample was normalized to the peak intensity of the total useful signal (Cuevas-Delgado et al 2020). Multivariate (PCA and PLS-DA) and univariate (ANOVA) statistical analyses were carried out using MetaboAnalyst (www.…”
Section: LC-MS Analysis and Data Processing
confidence: 99%
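The two preprocessing steps this statement describes — dropping features detected in 50% or fewer of samples, then scaling each sample to its total useful signal (TUS) — can be sketched in a few lines. This is an illustrative sketch, not the cited authors' code; the function name, the convention that zeros/NaNs mark undetected features, and the example matrix are assumptions.

```python
# Sketch of the two steps described above: feature filtering by detection
# rate (>50% of samples), then total-useful-signal (TUS) normalization.
# Assumption: zeros or NaNs in X denote features not detected in a sample.
import numpy as np

def filter_and_tus_normalize(X, min_fraction=0.5):
    """X: samples-by-features intensity matrix. Returns the filtered,
    row-normalized matrix in which each sample sums to 1."""
    X = np.nan_to_num(np.asarray(X, dtype=float))
    detected = (X > 0).mean(axis=0)            # detection rate per feature
    X = X[:, detected > min_fraction]          # keep features in >50% of samples
    tus = X.sum(axis=1, keepdims=True)         # total useful signal per sample
    return X / tus                             # scale each sample by its TUS

# Example: the middle feature is detected in only 1 of 3 samples,
# so it is dropped before normalization.
X = np.array([[100.0,  0.0, 300.0],
              [200.0,  0.0, 600.0],
              [300.0, 50.0, 900.0]])
X_norm = filter_and_tus_normalize(X)
```

After this step every retained sample has unit total intensity, which is what makes samples "comparable in terms of absolute intensities" in the sense of the first citation statement above.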
“…This constant is the median of all detected signal intensities of a sample. Compared to the mean value, the median has the advantage that it is more robust against individual outliers [ 103 , 104 , 105 ].…”
Section: From Non-targeted Data Sets To Marker Compounds
confidence: 99%
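The robustness claim in this statement is easy to verify numerically. The sketch below is illustrative only (the signal values are made up): one spiked feature shifts the mean of a sample's intensities severalfold while leaving the median — and hence a median-based normalization constant — almost unchanged.

```python
# Illustration of why the median is preferred over the mean as a
# per-sample normalization constant: a single outlier feature moves
# the mean drastically but barely moves the median. Values are made up.
import numpy as np

signals = np.array([100.0, 110.0, 90.0, 105.0, 95.0])   # typical intensities
spiked = np.append(signals, 5000.0)                     # one outlier feature

# Ratio of the normalization constant with vs. without the outlier:
mean_shift = spiked.mean() / signals.mean()             # mean inflates ~9x
median_shift = np.median(spiked) / np.median(signals)   # median barely moves
```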