2019 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)
DOI: 10.1109/lics.2019.8785811

Learning Concepts Definable in First-Order Logic with Counting

Abstract: We study classification problems over relational background structures for hypotheses that are defined using logics with counting. The aim of this paper is to find learning algorithms running in time sublinear in the size of the background structure. We show that hypotheses defined by FOCN(P)formulas over structures of polylogarithmic degree can be learned in sublinear time. Furthermore, we prove that for structures of unbounded degree there is no sublinear learning algorithm for first-order formulas.
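To make the setting in the abstract concrete, here is a minimal toy sketch (hypothetical code, not the paper's algorithm): examples are elements of a relational background structure, a hypothesis is a counting-style formula, and evaluating it only touches the local neighbourhood of an example, which is what makes sublinear running time plausible for low-degree structures.

```python
# Hypothetical background structure: an undirected graph as an adjacency dict.
background = {
    0: {1, 2},
    1: {0, 2},
    2: {0, 1, 3},
    3: {2},
}

def hypothesis(v, threshold=2):
    """Toy FOCN-style hypothesis: 'v has at least `threshold` neighbours'.

    Evaluating it inspects only the neighbourhood of v, never the whole
    structure -- a stand-in for the local-access model of the paper.
    """
    return len(background[v]) >= threshold

# Labeled training examples: (element of the structure, label).
examples = [(0, True), (1, True), (2, True), (3, False)]

# Consistency check: does the hypothesis fit all training examples?
consistent = all(hypothesis(v) == label for v, label in examples)
print(consistent)  # True
```

The names `background`, `hypothesis`, and `threshold` are illustrative assumptions; the point is only that the cost of evaluating the hypothesis scales with the degree of the queried element, not with the size of the structure.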

Cited by 5 publications (4 citation statements) | References 19 publications
“…Other recent work studies the parameterized complexity of learning queries in FO [van Bergerem et al. 2022], algorithms for learning in FO with counting [van Bergerem 2019], and learning in description logics [Funk et al. 2019]. Applications for FO learning have emerged, e.g., synthesizing invariants [Garg et al. 2014; Hance et al. 2021; Koenig et al. 2020, 2022; Yao et al. 2021] and learning program properties [Astorga et al. 2019, 2021; Miltner et al. 2020].…”
Section: Related Work
confidence: 99%
“…FO, FOW₁(P), FOWA₁(P), FOWA(P)), let σ be a signature, and let Φ ⊆ L[σ, S, W] be a set of formulas ϕ(x̄, ȳ) with |x̄| = k and |ȳ| = ℓ. For a (σ, W)-structure A, we follow the same approach as [4, 7, 8, 10, 24] and consider the instance space X = Aᵏ and concepts from the concept class C(Φ, A, k,…”
Section: Learning Concepts On Weighted Structures
confidence: 99%
“…To obtain a reasonable running time, we intend to find algorithms that compute a hypothesis in sublinear time, measured in the size of the background structure. This local access model has already been studied for relational structures in [8, 24] for concepts definable in FO or in FOCN(P). Modifications of the local access model for strings and trees have been studied in [4, 7].…”
Section: Learning Concepts On Weighted Structures
confidence: 99%