2010
DOI: 10.1007/s10994-009-5163-1
Fast learning of relational kernels

Abstract: We develop a general theoretical framework for statistical logical learning with kernels based on dynamic propositionalization, where structure learning corresponds to inferring a suitable kernel on logical objects, and parameter learning corresponds to function learning in the resulting reproducing kernel Hilbert space. In particular, we study the case where structure learning is performed by a simple FOIL-like algorithm, and propose alternative scoring functions for guiding the search process. We present an …
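The abstract's core idea, that a set of learned clauses induces a kernel on logical objects, can be illustrated with a minimal sketch. This is not the authors' implementation; the clauses and the example representation below are hypothetical stand-ins. In dynamic propositionalization, each clause acts as a boolean feature, and the induced kernel between two examples is the dot product of their feature vectors, i.e. the number of clauses covering both.

```python
# Illustrative sketch (hypothetical clauses, not the kFOIL code):
# each learned clause is modeled as a predicate over a simple
# dict-based example representation.
clauses = [
    lambda ex: ex["bond"] and ex["aromatic"],  # clause 1
    lambda ex: ex["charge"] > 0,               # clause 2
    lambda ex: ex["bond"] or ex["ring"],       # clause 3
]

def phi(example):
    """Propositionalize: one boolean feature per clause."""
    return [1 if clause(example) else 0 for clause in clauses]

def k(x, z):
    """Induced kernel: count of clauses that cover both x and z."""
    return sum(a * b for a, b in zip(phi(x), phi(z)))

x = {"bond": True, "aromatic": True, "charge": -1, "ring": False}
z = {"bond": True, "aromatic": False, "charge": 1, "ring": True}
print(k(x, z))  # clauses 1 and 3 cover x; 2 and 3 cover z; overlap -> 1
```

Structure learning then amounts to searching for the clause set (and hence the kernel), while parameter learning fits a function in the resulting feature space, e.g. via a kernel machine using `k` as a precomputed kernel.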

Cited by 57 publications (68 citation statements)
References 55 publications
“…In most cases, these methods aim to develop a suitable representation of structured data for subsequent learning, not to discover features. There are previous works, however, where the feature space itself is learned from relational data [23,29] and is interpretable in terms of definite clauses.…”
Section: Results
confidence: 99%
“…The model, referred to as kFOIL, implements a dynamic propositionalization approach and allows one to perform both classification and regression tasks. In (Landwehr et al 2010), a general theoretical framework for statistical logical learning with kernels based on dynamic propositionalization is developed where structure learning corresponds to inferring a suitable kernel on logical objects, and parameter learning corresponds to function learning in the resulting reproducing kernel Hilbert space.…”
Section: Related Work
confidence: 99%
“…This means that using ILP systems just based on default values for parameters (the accepted practice at present) can give misleading estimates of the best response possible from the system. This is illustrated in Figure 13, which shows estimated accuracies on other data sets reported in the literature that also use the Aleph system with default values for all parameters (these data sets have been used widely: see, for example, Landwehr et al., 2006 and Muggleton et al., 2008). Taken with our previous results for the mutagenesis and carcinogenesis data (we will only use the B max results, as these are the results used in the literature), we are now able to make some statements of statistical significance.…”
Section: Results
confidence: 99%