2016
DOI: 10.1002/minf.201600118

Performance of Deep and Shallow Neural Networks, the Universal Approximation Theorem, Activity Cliffs, and QSAR

Abstract: Neural networks have generated valuable Quantitative Structure-Activity/Property Relationships (QSAR/QSPR) models for a wide variety of small molecules and materials properties. They have grown in sophistication and many of their initial problems have been overcome by modern mathematical techniques. QSAR studies have almost always used so-called "shallow" neural networks in which there is a single hidden layer between the input and output layers. Recently, a new and potentially paradigm-shifting type of neural…

Cited by 96 publications (70 citation statements); References 39 publications

“…Neural network and other machine learning methods can generate models quickly and effectively and require few or no assumptions to be made about the form of the mathematical relationship, as they are universal approximators [27,28]. The rise of deep learning algorithms within the last five years has stimulated the use of neural network and machine learning approaches substantially [4,5,29].…”
Section: Machine Learning Modelling Methods (mentioning)
confidence: 99%
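The universal-approximator claim in the excerpt above can be illustrated with a minimal sketch: a single-hidden-layer ("shallow") network of the kind the paper discusses, fitted to a one-dimensional nonlinear target by plain gradient descent. All sizes, hyperparameters, and variable names below are illustrative assumptions, not taken from the cited studies.

```python
import numpy as np

# Minimal sketch (illustrative, not from the cited studies): a shallow
# network -- one hidden layer with a bounded, nonconstant activation
# (tanh) -- trained by full-batch gradient descent to fit a smooth
# nonlinear 1-D target, in the spirit of the universal approximation
# theorem.

rng = np.random.default_rng(0)

X = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)   # toy "descriptor"
y = np.sin(X)                                    # toy "activity"

n_hidden = 20
W1 = rng.normal(0.0, 1.0, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05
losses = []
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden activations, shape (200, 20)
    pred = H @ W2 + b2                # network output, shape (200, 1)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))

    # Backpropagation written out by hand for the two layers.
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    gH = (err @ W2.T) * (1.0 - H ** 2)   # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ gH / len(X)
    gb1 = gH.mean(axis=0)

    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Widening the hidden layer (larger `n_hidden`) lets the fit become arbitrarily close on this interval, which is exactly what the approximation theorem guarantees for a single hidden layer; no assumption about the functional form of the target is needed.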
“…One such re‐imagining of an older concept is in the burgeoning field of deep learning. Certain characteristics of this set of methods enable and encourage their nascent applications in the field of molecular design. Essentially, deep learning models are universal function estimators and hypothesis generators:…”
Section: Figure (mentioning)
confidence: 99%
“…Scholars have used various classifiers in medical brain image analysis, such as decision trees, support vector machines [25], and naive Bayesian classifiers. Nevertheless, the feedforward neural network (FNN) has achieved remarkable success because of the universal approximation theorem [26], which states the following: suppose the activation function is a bounded and nonconstant continuous function.…”
Section: The Classifier Construction Based on a Feedforward Neural Network (mentioning)
confidence: 99%
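The quoted theorem statement breaks off after its hypothesis. For context, a standard form of the universal approximation theorem (a sketch in the style of Hornik's 1991 result, not necessarily the exact wording of the reference cited as [26]) is:

```latex
\textbf{Universal approximation theorem (sketch).}
Let $\sigma:\mathbb{R}\to\mathbb{R}$ be bounded, nonconstant, and continuous.
Then for every compact set $K\subset\mathbb{R}^d$, finite sums of the form
\[
  f(x) \;=\; \sum_{i=1}^{N} v_i\,\sigma\!\left(w_i^{\top}x + b_i\right),
  \qquad v_i, b_i \in \mathbb{R},\; w_i \in \mathbb{R}^d,
\]
are dense in $C(K)$: every continuous function on $K$ can be approximated
uniformly to arbitrary accuracy by a single-hidden-layer feedforward network
with enough hidden units $N$.
```

Note that the theorem guarantees the existence of such a network but says nothing about how many units are needed or whether training will find it, which is one reason the shallow-versus-deep comparison in the paper is an empirical question.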