1986
DOI: 10.1007/bf00336991
Linear function neurons: Structure and training

Abstract: Three different representations for a thresholded linear equation are developed. For binary input they are shown to be representationally equivalent, though their training characteristics differ. A training algorithm for linear equations is discussed. The similarities between its simplest mathematical representation (perceptron training), a formal model of animal learning (Rescorla-Wagner learning), and one mechanism of neural learning (Aplysia gill withdrawal) are pointed out. For d input features, perceptron …
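The training rule the abstract alludes to can be illustrated in a few lines. Below is a minimal sketch of classic perceptron training on a thresholded linear unit; the toy data, learning rate, and epoch count are illustrative assumptions, not taken from the paper. The error-driven update, proportional to (target - prediction), has the same form the Rescorla-Wagner model gives for changes in associative strength.

```python
import numpy as np

def perceptron_train(X, y, lr=0.1, epochs=100):
    """Train a thresholded linear unit: predict 1 iff w.x + b > 0.

    Error-driven update (same form as the Rescorla-Wagner rule):
    weights move in proportion to (target - prediction).
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0
            delta = lr * (target - pred)
            w += delta * xi
            b += delta
            errors += int(delta != 0)
        if errors == 0:  # converged: every example classified correctly
            break
    return w, b

# Toy linearly separable problem (binary AND), purely illustrative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
print(w, b)
```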

Cited by 57 publications (36 citation statements) | References 40 publications

“…[Hon87, Rag88]). Since there are 2^{Ω(n²)} halfspaces over {0,1}^n, a counting argument shows that there exist halfspaces over {0,1}^n that require weight 2^{Ω(n)}, and specific halfspaces that require weight 2^{Ω(n)} have been known for decades [MP68, HV86]. Håstad [Hås94] exhibited a specific halfspace that has weight n^{Ω(n)} and his construction was subsequently refined in [AV97].…”
Section: Previous Work and Our Results (mentioning)
confidence: 99%
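The counting argument invoked in this excerpt is short enough to spell out. The following is a standard reconstruction from the counts it states, not text from the cited papers: if every halfspace over {0,1}^n were realizable with integer weights of magnitude at most W, then the (2W+1)^{n+1} possible weight settings could not cover all 2^{Θ(n²)} distinct halfspaces unless W is exponentially large.

```latex
% A halfspace with integer weights in [-W, W] is specified by n+1 integers,
% so at most (2W+1)^{n+1} functions arise; there are 2^{c n^2} halfspaces.
\[
  2^{c n^2} \le (2W+1)^{n+1}
  \;\Longrightarrow\;
  \log_2(2W+1) \ge \frac{c\,n^2}{n+1} = \Omega(n)
  \;\Longrightarrow\;
  W = 2^{\Omega(n)}.
\]
```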
“…n that require weight 2^{Ω(n)}, and specific halfspaces that require weight 2^{Ω(n)} have been known for decades [9,5]. [6] exhibited a specific halfspace that has weight n^{Ω(n)} and his construction was subsequently refined by [1].…”
Section: Previous Work and Our Results (mentioning)
confidence: 99%
“…In formulating decision trees, various impact-related variables are analyzed, using previously built models that are related with testing data to predict future events (Smith & Tansley, 2003). Decision trees are used to predict probable outcomes, such as forecasting a company's future, customers' intentions and likely decisions (Potter & Potter, 1988; Fayyad & Irani, 1992; Hampson & Volper, 1986).…”
Section: Overview Of Related Studies (mentioning)
confidence: 99%
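As a concrete illustration of the "predict probable outcomes" usage this excerpt describes, here is a minimal decision-tree sketch using scikit-learn; the feature matrix, labels, and depth limit are made-up stand-ins, not data or settings from any of the cited studies.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: each row is [prior_purchases, site_visits];
# label 1 = customer likely to buy again, 0 = not.
X = [[0, 1], [1, 2], [3, 5], [4, 8], [0, 0], [5, 9]]
y = [0, 0, 1, 1, 0, 1]

# Fit a shallow tree on past cases, then predict a future outcome.
clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(X, y)
print(clf.predict([[2, 4]]))  # predicted class for a new customer
```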
“…Hampson and Volper (1986) applied decision trees in structuring and training linear function neurons in biology, Friedl and Brodley (1997) used decision trees to classify land cover from remotely sensed data, while Hautaniemi et al (2005) modelled a signal response with decision tree analysis. Tso and Yau (2007) also predicted the consumption of electricity using decision trees, while Goodwin, Wrights and Philips (2004) analyzed management judgments using decision tree classification.…”
Section: Overview Of Related Studies (mentioning)
confidence: 99%