1994
DOI: 10.2307/2275402

Machine learning of higher-order programs

Abstract: A generator program for a computable function (by definition) generates an infinite sequence of programs all but finitely many of which compute that function. Machine learning of generator programs for computable functions is studied. To partially motivate these studies, it is shown that, in some cases, interesting global properties of computable functions can be proved from suitable generator programs which cannot be proved from any ordinary programs for them. The power (for variants of various learning cri…
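Stated a little more formally, the definition in the abstract can be sketched as follows (a hedged formalization in the standard acceptable-programming-system notation of the inductive-inference literature; the symbols below are assumptions, since the abstract does not fix any notation):

\[
  g \text{ is a generator program for a computable } f
  \;\iff\;
  \varphi_g \text{ enumerates programs } e_0, e_1, e_2, \ldots
  \text{ with } \varphi_{e_i} = f \text{ for all but finitely many } i,
\]
where \(\varphi\) is a fixed acceptable programming system.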

Cited by 11 publications (6 citation statements)
References 21 publications
“…2000). The J48 decision tree inducer, based on the C4.5 algorithm, was implemented with the parameter ‘MinNumObj’ set at a value of 11 to limit the complexity of theories and minimize the risk of over-fitting (Baliga et al 1992). Classifiers were evaluated using 100 iterations of stratified 10-fold cross-validations, a procedure designed to conservatively reflect the performance of classification models on novel data sets (Witten and Frank.…”
Section: Methods (mentioning)
confidence: 99%
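For readers who want the evaluation protocol described in that statement in runnable form, here is a minimal sketch. It is not the citing paper's code: scikit-learn's CART tree (DecisionTreeClassifier with min_samples_leaf=11) stands in as an approximation of Weka's J48/C4.5 with MinNumObj set to 11, and the iris data set is only a placeholder for the real data.

    # Sketch only: approximate J48's MinNumObj=11 with a minimum leaf size,
    # and evaluate with 100 repetitions of stratified 10-fold cross-validation.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)  # placeholder data set

    clf = DecisionTreeClassifier(min_samples_leaf=11, random_state=0)
    cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=100, random_state=0)

    scores = cross_val_score(clf, X, y, cv=cv)  # 1000 fold-level accuracies
    print(f"mean accuracy over {len(scores)} folds: {scores.mean():.3f}")

Repeating the stratified folds many times is what makes the estimate conservative and stable: the reported figure averages over many random partitions rather than a single lucky or unlucky split.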
“…This is interesting since, for example, if the universe is completely algorithmic, then all real language texts generated by parents for their children are recursive! […] -identification are required to be "nearly" minimal size and, hence, even more likely to fit in one's head.…”
Section: Preliminaries (mentioning)
confidence: 99%
“…In this way results about learning programs for functions can be interpreted as results about finding predictive explanations for phenomena - as results about scientific induction. For more on this see [3,31,22,7,21,56]. Regarding the names of the learning criteria studied in the present paper, originally [31] "Ex" stood for "explanatory," "Fex" stood for "finitely explanatory," and "Bc" for "behaviorally correct."…”
mentioning
confidence: 98%
“…[26] Fulk [Ful85] argues that the set of distinguishable experiments one can actually do and record on a phenomenon is countable: lab manuals can and do contain only finite notations from a finite alphabet and/or bounded-size, finite-precision images. [27] The difference is somewhat analogous to the difference between predicting the location of a planet at any time and predicting the shape of the planet's orbit [CJS92, BCJS94]. [28] The trick is to express the limiting computable partial functions as the uniform limit of a single, suitable computable function [RC94, CS06].…”
Section: Theorem 2 (Bārzdiņš [B74], Blum and Blum [Bb75]) (mentioning)
confidence: 99%