2001
DOI: 10.1145/383779.383781

An extended transformation approach to inductive logic programming

Abstract: Inductive logic programming (ILP) is concerned with learning relational descriptions that typically have the form of logic programs. In a transformation approach, an ILP task is transformed into an equivalent learning task in a different representation formalism. Propositionalization is a particular transformation method, in which the ILP task is compiled to an attribute-value learning task. The main restriction of propositionalization methods such as LINUS is that they are unable to deal with nondeterminate l…
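
To make the propositionalization idea in the abstract concrete, here is a minimal, hedged sketch: each relational feature is evaluated against an example's background facts, producing one boolean attribute per feature. The feature names and facts below are hypothetical illustrations in the spirit of the East-West trains domain, not the LINUS, SINUS or RSD implementation.

```python
# A minimal sketch of propositionalization: each relational (first-order)
# feature is tested against an example's background facts, yielding one
# boolean attribute per feature. Facts and feature names are hypothetical.

# Background facts for one train example (hypothetical).
facts = {
    ("has_car", "t1", "c1"),
    ("has_car", "t1", "c2"),
    ("short", "c1"),
    ("closed", "c1"),
    ("long", "c2"),
}

def cars_of(train, facts):
    """All cars attached to a train according to the has_car relation."""
    return [f[2] for f in facts if f[0] == "has_car" and f[1] == train]

# Relational features expressed as Python predicates; in an ILP system these
# would be first-order clauses whose local (car) variable is existentially
# quantified, which is the source of the nondeterminacy the abstract mentions.
features = {
    "has_short_closed_car": lambda t, f: any(
        ("short", c) in f and ("closed", c) in f for c in cars_of(t, f)
    ),
    "has_long_car": lambda t, f: any(("long", c) in f for c in cars_of(t, f)),
}

# Propositionalization: one attribute-value row per example.
row = {name: test("t1", facts) for name, test in features.items()}
print(row)  # {'has_short_closed_car': True, 'has_long_car': True}
```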


Cited by 65 publications (42 citation statements)
References 48 publications
“…Furthermore, additional experiments with classificatory induction using the generated propositional form indicate that RSD produces features leading to high classification accuracy. Lastly, absolute-runtime comparisons with SINUS, which implements the propositionalization procedure described in Flach and Lachiche (1999) and Lavrač and Flach (2001), indicate the superiority of RSD, although these figures are not conclusive due to the inherently different software and hardware used by RSD and SINUS.…”
Section: Conclusion and Further Work
confidence: 98%
“…The main contributions of this paper concern the transfer of this methodology to the multi-relational learning setting. The contributions include substantial improvements of the propositionalization step (compared to the propositionalization proposed by Flach and Lachiche (1999) and Lavrač and Flach (2001)) and an effective implementation of the relational subgroup discovery algorithm RSD, employing language and evaluation constraints. Further contributions concern the analysis of the RSD subgroup discovery algorithm in ROC space, and the successful application of RSD to standard ILP problems (East-West trains, King-Rook-King chess endgame and mutagenicity prediction) and two real-life problem domains (analysis of telephone calls and analysis of traffic accidents).…”
Section: Are As Large As Possible and Have The Most Unusual Statistics
confidence: 99%
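
The ROC-space analysis of subgroups mentioned in this excerpt can be pictured with a small sketch: each subgroup rule becomes a point (FPr, TPr), and rules toward the top-left corner dominate the others. The function, dataset sizes and rule names below are hypothetical and are not taken from the RSD paper.

```python
# A hedged sketch of viewing subgroup (rule) quality in ROC space:
# each rule covering tp positives and fp negatives maps to a point (FPr, TPr).
# All numbers and rule names are hypothetical.

def roc_point(tp: int, fp: int, pos: int, neg: int) -> tuple[float, float]:
    """Return (FPr, TPr) for a subgroup covering tp positives and fp negatives."""
    return fp / neg, tp / pos

# Example: a dataset with 40 positive and 60 negative examples (hypothetical).
pos, neg = 40, 60
subgroups = {
    "rule_A": (25, 10),   # covers 25 positives, 10 negatives
    "rule_B": (30, 30),
    "rule_C": (10, 2),
}

for name, (tp, fp) in subgroups.items():
    fpr, tpr = roc_point(tp, fp, pos, neg)
    print(f"{name}: FPr={fpr:.2f}, TPr={tpr:.2f}")
```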
“…A relational database consists of a set of named tables, often referred to as relations, each of which individually behaves like the single table that is the subject of Propositional Data Mining [5]. Data structures more complex than a single record are implemented by relating pairs of tables through so-called foreign key relations [6]. Such a relation specifies how certain columns in one table can be used to look up information in corresponding columns in the other table, thus relating sets of records in Tables 1 and 2 [7,8].…”
Section: Relational Databases
confidence: 99%
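
The foreign-key relationship described in this excerpt can be sketched with an in-memory SQLite example: a one-to-many link from one table to another that a single propositional table cannot represent directly. The table and column names below are hypothetical, not taken from the cited papers.

```python
# A small illustrative sketch of a foreign-key relation between two tables:
# records in the atom table reference records in the molecule table via a key
# column. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE molecule (
        mol_id INTEGER PRIMARY KEY,
        active INTEGER
    );
    CREATE TABLE atom (
        atom_id INTEGER PRIMARY KEY,
        mol_id  INTEGER REFERENCES molecule(mol_id),  -- foreign key
        element TEXT
    );
    INSERT INTO molecule VALUES (1, 1), (2, 0);
    INSERT INTO atom VALUES (10, 1, 'C'), (11, 1, 'N'), (12, 2, 'O');
""")

# The foreign key lets us look up all atoms belonging to each molecule,
# i.e. a one-to-many structure beyond a single attribute-value table.
rows = conn.execute("""
    SELECT m.mol_id, m.active, COUNT(a.atom_id) AS n_atoms
    FROM molecule m JOIN atom a ON a.mol_id = m.mol_id
    GROUP BY m.mol_id, m.active
""").fetchall()
print(rows)  # [(1, 1, 2), (2, 0, 1)]
```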