Descriptor Selection Methods in Quantitative Structure–Activity Relationship Studies: A Review Study
2013 | DOI: 10.1021/cr3004339

Cited by 185 publications (117 citation statements)
References 65 publications
“…the lowest ratio of the number of NMs to the number of descriptors should be 5 to 1, and the dependence of its performance on redundant variables, i.e. the presence of correlated input variables, as well as input variables that are irrelevant to the output, may lead to poor model performance (Shahlaei, 2013). Dimension reduction methods such as PCA can be useful for eliminating correlations between input variables (i.e.…”
Section: Decision Trees (DTs), Automatic Generation of Decision Trees
confidence: 99%
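The excerpt above points to dimension reduction, e.g. PCA, as a way to eliminate correlations among input descriptors before a decision tree (or any other model) is built. A minimal sketch of that idea, assuming scikit-learn and a synthetic descriptor matrix (the 95% variance threshold and all variable names are illustrative assumptions, not values from the cited review):

```python
# Sketch: replace a block of correlated descriptors with uncorrelated
# principal-component scores. Data, shapes, and the 95% threshold are
# illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))                             # 100 compounds x 20 descriptors
X[:, 10:] = X[:, :10] + 0.05 * rng.normal(size=(100, 10))  # make half the columns redundant

X_std = StandardScaler().fit_transform(X)  # PCA is scale-sensitive, so standardize first
pca = PCA(n_components=0.95)               # keep enough components for 95% of the variance
scores = pca.fit_transform(X_std)          # mutually uncorrelated inputs for the model

print(X.shape, "->", scores.shape)
```

The component scores are orthogonal by construction, so the "correlated input variables" problem described above disappears, at the cost of losing the interpretability of the original descriptors.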
“…In this procedure, the independent variables (descriptors) that correlate strongly with the experimental values (such as IC50) are added step by step to a regression equation, and the significance of each added variable is tested to decide whether it remains in the final multivariate equation. If an added descriptor does not contribute significantly to the regression equation, it is removed (Shahlaei, 2013). SPSS 16.0 was used, with the probability of F as the stepping criterion (F = 0.05 for entry and 0.1 for removal).…”
Section: Stepwise Regression Methods
confidence: 99%
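The stepwise procedure quoted above (entry and removal governed by the probability of F, 0.05 and 0.1 in SPSS 16.0) can be sketched roughly as follows. This is an assumed re-implementation using statsmodels OLS p-values on synthetic data, not the cited authors' code; the p-value thresholds stand in for SPSS's F probabilities:

```python
# Sketch: forward-entry / backward-removal stepwise descriptor selection.
# Thresholds mirror the quoted SPSS criteria (0.05 to enter, 0.10 to remove);
# the data and descriptor names are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def stepwise_select(X, y, p_enter=0.05, p_remove=0.10):
    selected = []
    while True:
        changed = False
        # Forward step: try each remaining descriptor, enter the most significant one.
        remaining = [c for c in X.columns if c not in selected]
        p_new = pd.Series(dtype=float)
        for c in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
            p_new[c] = model.pvalues[c]
        if not p_new.empty and p_new.min() < p_enter:
            selected.append(p_new.idxmin())
            changed = True
        # Backward step: remove any entered descriptor that is no longer significant.
        if selected:
            model = sm.OLS(y, sm.add_constant(X[selected])).fit()
            p_sel = model.pvalues.drop("const")
            if p_sel.max() > p_remove:
                selected.remove(p_sel.idxmax())
                changed = True
        if not changed:
            return selected

rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(60, 8)), columns=[f"d{i}" for i in range(8)])
y = 2.0 * X["d0"] - 1.5 * X["d3"] + rng.normal(scale=0.5, size=60)  # pIC50-like response
print(stepwise_select(X, y))  # typically recovers d0 and d3
```

The forward step enters the descriptor most strongly related to the response (lowest p-value), and the backward step re-tests everything already in the equation and drops whatever has lost significance, which mirrors the enter/remove logic described in the excerpt.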
“…The leave-one-out (LOO) cross-validation method [29-32] was used to find the optimal set of spectral lines and to estimate the predictivity of the regression model. We used spectra of 36 pellets with nine different heating values for calibration; for testing, we left out the subset of spectra of four pellets of the same sample, whose heating value was known from calorimetric measurements.…”
Section: Cross-validation
confidence: 99%
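For the leave-one-out cross-validation mentioned in this last excerpt, a generic sketch is given below. The cited study leaves out all spectra of one pellet subset at a time; here a plain per-observation LOO with a linear model and synthetic data is shown, and q² is the usual PRESS-based predictive statistic (both the model and the data are assumptions):

```python
# Sketch: leave-one-out cross-validation with a PRESS-based q^2.
# Model choice and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(2)
X = rng.normal(size=(36, 5))                                     # 36 samples x 5 features
y = X @ np.array([1.0, 0.5, 0.0, -0.8, 0.2]) + rng.normal(scale=0.1, size=36)

preds = np.empty_like(y)
for train_idx, test_idx in LeaveOneOut().split(X):   # each sample is left out once
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    preds[test_idx] = model.predict(X[test_idx])

press = np.sum((y - preds) ** 2)                     # predictive residual sum of squares
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)       # cross-validated (predictive) R^2
print(f"LOO q2 = {q2:.3f}")
```

Leaving out whole groups of spectra belonging to the same pellet, as in the quoted study, would replace LeaveOneOut with a grouped splitter such as scikit-learn's LeaveOneGroupOut while keeping the rest of the loop unchanged.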