1989
DOI: 10.1007/bf00114804

On learning sets and functions

Abstract: This paper presents some results on the probabilistic analysis of learning, illustrating the applicability of these results to settings such as connectionist networks. In particular, it concerns the learning of sets and functions from examples and background information. After a formal statement of the problem, some theorems are provided identifying the conditions necessary and sufficient for efficient learning, with respect to measures of information complexity and computational complexity. Intuitive interpre…

Cited by 116 publications (117 citation statements: 2 supporting, 115 mentioning, 0 contrasting; citing years 1994–2019). References 17 publications. The citation statements below are ordered by relevance.
“…In some work on function learning, such as [17,7,11], a dimension known as the graph dimension has proven to be useful. The graph dimension of a class H of functions that map from X to a set Y is the VC-dimension of the class…”
Section: Proposition (mentioning, confidence: 99%)
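
The quoted definition is cut off. As a hedged reconstruction from the standard usage of the term in this literature (not taken from the citing papers), the graph dimension of H is the VC-dimension of the class of graphs of the functions in H:

% Hedged sketch: graph dimension via the graph class of H, where H is a
% class of functions h : X -> Y and G_h is the graph of h inside X x Y.
\[
  G_h = \{ (x, y) \in X \times Y : h(x) = y \},
  \qquad
  \dim_G(H) = \operatorname{VCdim}\bigl( \{ G_h : h \in H \} \bigr).
\]
% Equivalently, S \subseteq X is graph-shattered if some labeling f : S -> Y
% lets H realize every pattern of "h(x) = f(x)" versus "h(x) \ne f(x)" on S.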
“…Given a particular determination, we estimate the size of the function space consistent with the determination. The main result in PAC learning that we use is the dimensionality theorem, which relates the number of training examples needed for successful learning to the size of the function space [Blumer et al, 1989; Natarajan, 1989]. Informally, this theorem says that the number of examples sufficient for successful learning varies logarithmically with the asymptotic size of the function space.…”
Section: Informal Overview of the Approach (mentioning, confidence: 99%)
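
For concreteness, one standard form of such a bound, sketched here for a finite function class H under the usual PAC assumptions (a hedged illustration, not necessarily the exact statement the authors invoke): a learner that outputs any hypothesis consistent with

\[
  m \;\ge\; \frac{1}{\varepsilon} \Bigl( \ln|H| + \ln\frac{1}{\delta} \Bigr)
\]

independent examples has, with probability at least 1 - \delta, error at most \varepsilon. The required sample size therefore grows only logarithmically with the size of the function space, which is the informal statement in the quote.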
“…The relationship of Natarajan's dimension to the more popular Vapnik-Chervonenkis dimension [Blumer et al, 1989] is discussed in [Natarajan, 1989]. To calculate the dimension of the function space B in our boolean function example above, we note that there are 2^{2^n}.…”
Section: Definition 6: A Function f Is Consistent with a Set of Examples (mentioning, confidence: 99%)
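
Assuming B in that example denotes the unrestricted class of Boolean functions on n variables (an assumption; the snippet is truncated), the arithmetic behind the quoted count would be:

\[
  |B| = 2^{2^{n}}, \qquad \log_2 |B| = 2^{n},
\]

so a bound of the kind sketched above would require on the order of 2^{n}/\varepsilon examples, i.e. exponentially many in n unless the function space is restricted (for instance by a determination).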
“…As alternative definitions, a variety of notions of dimension for classes of {0,..., m}-valued functions had been proposed [4,7], and Ben-David et al gave a general scheme [2] which unified them. They introduced Ψ-dimension, where Ψ is a family of mappings which translate {0,..., m}-valued functions into {0, 1}-valued ones.…”
Section: Complexity of Ψ-Dimension Problems (mentioning, confidence: 99%)
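
A hedged sketch of that general scheme, reconstructed from the standard presentation of Ben-David et al. rather than quoted from the citing paper: Ψ is a family of translations ψ : {0,..., m} → {0, 1, *}, and a finite set is Ψ-shattered when some choice of translations realizes every binary pattern:

\[
  S = \{x_1, \dots, x_d\} \subseteq X \text{ is } \Psi\text{-shattered by } H
  \iff
  \exists\, \psi_1, \dots, \psi_d \in \Psi \;\;
  \forall\, b \in \{0,1\}^d \;\;
  \exists\, h \in H : \; \psi_i(h(x_i)) = b_i \text{ for } i = 1, \dots, d.
\]

The Ψ-dimension of H is the largest such d. The graph dimension and the Natarajan dimension are recovered by particular choices of Ψ: indicator translations ψ_y(z) = 1 iff z = y for the former, and pairwise translations mapping one label to 1, a second label to 0, and everything else to * for the latter.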
“…It is well-known that the Vapnik-Chervonenkis dimension (VC-dimension), which is a combinatorial parameter of a class of binary functions, plays the key role in determining whether the class is polynomial-sample learnable or not [3,5,8]. As a natural extension, the learnability of a class of {0,..., m}-valued functions has been characterized by various generalized notions such as pseudodimension [4], graph dimension [7], and Natarajan dimension [7]. Ben-David et al [2] unified them into a general scheme, by introducing a family Ψ of mappings which translate {0,..., m}-valued functions into binary ones.…”
Section: Introduction (mentioning, confidence: 99%)
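
To make these generalized dimensions concrete, the following is a small illustrative Python sketch (toy, hypothetical class; not from any of the cited papers) that brute-forces the graph dimension and the Natarajan dimension of a finite class of {0,..., m}-valued functions on a finite domain, using the shattering definitions sketched above.

"""
Hedged, illustrative sketch (toy data, not from the cited papers):
brute-force the graph dimension and the Natarajan dimension of a small
finite class H of {0,...,m}-valued functions on a finite domain X.
"""
from itertools import combinations, product

# Domain X = {0, 1, 2}; each hypothesis is a tuple (h(0), h(1), h(2)).
# H below is an arbitrary toy class chosen only for illustration.
X = (0, 1, 2)
H = [(0, 0, 0), (1, 1, 1), (0, 1, 2), (2, 1, 0), (0, 2, 1)]
Y = sorted({y for h in H for y in h})          # label set actually used


def graph_shatters(S):
    """S (a tuple of points) is graph-shattered if some labeling (y_x) on S
    lets H realize every pattern of 'h(x) == y_x' versus 'h(x) != y_x'."""
    for ys in product(Y, repeat=len(S)):
        patterns = {tuple(int(h[x] == y) for x, y in zip(S, ys)) for h in H}
        if len(patterns) == 2 ** len(S):
            return True
    return False


def natarajan_shatters(S):
    """S is Natarajan-shattered if there are labelings f1, f2 with
    f1(x) != f2(x) on all of S such that every way of choosing f1 or f2
    at each point of S is realized by some h in H."""
    for y1s in product(Y, repeat=len(S)):
        for y2s in product(Y, repeat=len(S)):
            if any(a == b for a, b in zip(y1s, y2s)):
                continue                        # labelings must disagree everywhere
            realized = set()
            for h in H:
                bits = []
                for x, a, b in zip(S, y1s, y2s):
                    if h[x] == a:
                        bits.append(1)
                    elif h[x] == b:
                        bits.append(0)
                    else:
                        break                   # h matches neither labeling at x
                else:
                    realized.add(tuple(bits))
            if len(realized) == 2 ** len(S):
                return True
    return False


def dimension(shatters):
    """Largest cardinality of a subset of X shattered in the given sense."""
    best = 0
    for d in range(1, len(X) + 1):
        if any(shatters(S) for S in combinations(range(len(X)), d)):
            best = d
    return best


print("graph dimension    :", dimension(graph_shatters))
print("Natarajan dimension:", dimension(natarajan_shatters))

For a class of binary functions both quantities reduce to the ordinary VC-dimension, which is the sense in which they generalize it.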