2003
DOI: 10.1002/qsar.200310004
Genetic Neural Networks for Functional Approximation

Abstract: Quantitative structure-activity relationship (QSAR) analysis is a commonly used ligand-based molecular design method for the lead optimization process in the pharmaceutical industry. Typically, development of a QSAR model goes through the following stages: descriptor generation, function approximation (including feature selection and model construction), and model validation. This article highlights a promising genetic neural network (GNN) method for function approximation in QSAR and reviews its underlying th…


Cited by 11 publications (3 citation statements)
References 42 publications
“…The option to employ elitism [45] was added. This carries the best user-specified number of trees through to the next generation.…”
Section: The Methods
Mentioning confidence: 99%
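The elitism operator quoted above can be sketched as one generation of a simple genetic algorithm. This is a minimal illustration, not code from the cited work; the function name `evolve_with_elitism` and the truncation-selection scheme are assumptions for the sketch.

```python
import random

def evolve_with_elitism(population, fitness, n_elite, mutate):
    """One generation of a genetic algorithm with elitism: the n_elite
    fittest individuals are copied unchanged into the next generation;
    the rest are mutated offspring of the fitter half (truncation selection)."""
    ranked = sorted(population, key=fitness, reverse=True)
    next_gen = ranked[:n_elite]  # elites survive untouched
    while len(next_gen) < len(population):
        parent = random.choice(ranked[: len(ranked) // 2])
        next_gen.append(mutate(parent))
    return next_gen

# Toy problem: maximize the number of 1-bits in a 10-bit string.
random.seed(0)
pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
fit = lambda ind: sum(ind)
flip = lambda ind: [b ^ 1 if random.random() < 0.1 else b for b in ind]
for _ in range(30):
    pop = evolve_with_elitism(pop, fit, n_elite=2, mutate=flip)
best = max(map(fit, pop))
```

Because the elites pass through unmodified, the best fitness in the population can never decrease between generations, which is the point of the operator.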
“…Genetic algorithms have been used in decision tree generation, to decide the splitting points and attributes to be used while growing a tree. In contrast, EPTree uses evolutionary programming, sometimes called genetic programming, to explore combinations of splitting nodes forming the decision trees themselves. This approach has all the benefits of genetic algorithms, including simultaneous investigation of many solutions and the avoidance of local minima, but does not require parameter encoding into fixed-length vectors called chromosomes. Furthermore, the direct action on the decision trees themselves avoids some of the problems usually associated with decision tree induction from genetic programming, such as the tendency to grow overly large trees, known as bloat.…”
Section: Introduction
Mentioning confidence: 99%
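The distinction the passage draws — operating directly on variable-shaped trees rather than on fixed-length chromosome vectors — can be illustrated with a minimal sketch. This is a hypothetical toy, not the EPTree implementation: trees are nested tuples `(feature, threshold, left, right)` with integer class labels at the leaves, and a `max_depth` cap stands in for bloat control.

```python
import random

def random_tree(depth, n_features=3, max_depth=3):
    """Grow a random decision tree; depth cap curbs bloat."""
    if depth >= max_depth or random.random() < 0.3:
        return random.choice([0, 1])  # leaf: predicted class
    f = random.randrange(n_features)
    t = random.uniform(0.0, 1.0)
    return (f, t,
            random_tree(depth + 1, n_features, max_depth),
            random_tree(depth + 1, n_features, max_depth))

def mutate(tree, depth=0, max_depth=3):
    """Replace a randomly chosen subtree with a freshly grown one,
    acting directly on the tree structure (no chromosome encoding)."""
    if random.random() < 0.2 or not isinstance(tree, tuple):
        return random_tree(depth, max_depth=max_depth)
    f, t, left, right = tree
    if random.random() < 0.5:
        return (f, t, mutate(left, depth + 1, max_depth), right)
    return (f, t, left, mutate(right, depth + 1, max_depth))

def predict(tree, x):
    """Route a sample x down the tree to a leaf label."""
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if x[f] <= t else right
    return tree
```

Because mutation rewrites subtrees in place, candidate trees can vary freely in size and shape — exactly what a fixed-length chromosome encoding cannot express.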
“…In order to implement this rule we used a fixed number of input neurons [4,7]. According to another rule of thumb, used by Chiu and So [39] and by Patankar and Jurs [10], the number of training objects (in this case, molecules) should be approximately twice the number of NN adjustable parameters. Similar or even more demanding rules can be found in the NN literature [36].…”
Section: NN Implementation
Mentioning confidence: 99%
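The rule of thumb quoted above is easy to make concrete. The sketch below counts the adjustable parameters (weights and biases) of a fully connected feed-forward network with one hidden layer and applies the stated factor of two; the helper names are illustrative, not from any of the cited papers.

```python
def nn_adjustable_parameters(n_input, n_hidden, n_output=1):
    """Weights + biases of a one-hidden-layer feed-forward network:
    (n_input*n_hidden + n_hidden) for the hidden layer,
    (n_hidden*n_output + n_output) for the output layer."""
    return (n_input * n_hidden + n_hidden) + (n_hidden * n_output + n_output)

def min_training_molecules(n_input, n_hidden, n_output=1, ratio=2):
    """Rule of thumb from the quoted passage: roughly `ratio` times as
    many training objects (molecules) as adjustable parameters."""
    return ratio * nn_adjustable_parameters(n_input, n_hidden, n_output)

# A 4-3-1 network has (4*3 + 3) + (3*1 + 1) = 19 adjustable parameters,
# so the 2x rule asks for about 38 training molecules.
params = nn_adjustable_parameters(4, 3, 1)   # → 19
needed = min_training_molecules(4, 3)        # → 38
```

Such counts explain why small QSAR data sets force very small hidden layers: the training-set requirement grows linearly with every added hidden neuron.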