2018
DOI: 10.1109/tsp.2018.2813337

Learning Graphs With Monotone Topology Properties and Multiple Connected Components

Abstract: Recent papers have formulated the problem of learning graphs from data as inverse covariance estimation with graph Laplacian constraints. While such problems are convex, existing methods cannot guarantee that solutions will have specific graph topology properties (e.g., being a tree or k-partite), which are desirable for some applications. In fact, the problem of learning a graph with given topology properties, e.g., finding the k-partite graph that best matches the data, is in general nonconvex. In this pa…

Cited by 48 publications (36 citation statements)
References 50 publications
“…Interestingly, this problem can be tackled using a convexification technique known as semidefinite relaxation [31]. More precisely, (41) can be recast as a Boolean quadratic program and then equivalently expressed as a semidefinite program subject to a rank constraint. Dropping this latter constraint, one arrives at a convex relaxation with provable approximation bounds; see [56] for full algorithmic, complexity, and performance details.…”
Section: Diffused Non-stationary Graph Signals
confidence: 99%
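The lifting step described in this excerpt — recasting a Boolean quadratic program as a semidefinite program plus a rank constraint — can be sketched numerically. This is a hedged illustration, not the paper's problem (41): `A` below is just a random symmetric matrix standing in for the quadratic form, and the code only verifies the lifting identity and the feasibility of the rank-1 point for the relaxation.

```python
import numpy as np

# Toy instance (assumption: random symmetric A, not taken from the cited work)
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetrize so x^T A x is a well-defined quadratic form

x = rng.choice([-1.0, 1.0], size=n)  # a Boolean (+/-1) point
X = np.outer(x, x)                   # the rank-1 lift X = x x^T

# Lifting identity: the Boolean QP objective equals a linear function of X
assert np.isclose(x @ A @ x, np.trace(A @ X))

# X is feasible for the SDP relaxation: unit diagonal and positive semidefinite;
# dropping the (nonconvex) rank(X) = 1 constraint yields the convex SDP.
assert np.allclose(np.diag(X), 1.0)
assert np.min(np.linalg.eigvalsh(X)) >= -1e-9
```

Solving the resulting SDP requires a conic solver; the point here is only that every Boolean point maps to a feasible rank-1 matrix, so the relaxation's optimum upper-bounds the original maximization.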
“…Gaussian models are ubiquitous in machine learning and statistical analysis of real-valued network data because of their widespread applicability and analytical tractability. Most recent advances in GMRF model selection have explored ways of incorporating Laplacian or other graph-topological constraints into the precision matrix estimation task [11], [21], [41], [43]. These approaches are well suited to settings where prior information dictates that, e.g., feasible graphs should have a tree structure or edge weights should be positive given the physics of the problem.…”
Section: A. Graph Signal Models and Their Relationships
confidence: 99%
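The Laplacian constraints mentioned in this excerpt can be made concrete. A minimal sketch, assuming a toy 4-node weighted path graph (not taken from any of the cited works): the combinatorial Laplacian L = D − W satisfies exactly the structural constraints a Laplacian-constrained precision estimate must respect, and shifting it to strict positive definiteness yields a valid GMRF precision matrix.

```python
import numpy as np

# Toy weighted adjacency for a 4-node path graph (illustrative assumption)
W = np.array([[0, 1, 0, 0],
              [1, 0, 2, 0],
              [0, 2, 0, 3],
              [0, 0, 3, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W  # combinatorial graph Laplacian L = D - W

# Constraints a Laplacian-constrained precision estimate must satisfy:
assert np.allclose(L, L.T)                       # symmetric
assert np.allclose(L.sum(axis=1), 0.0)           # rows sum to zero
assert np.min(np.linalg.eigvalsh(L)) >= -1e-9    # positive semidefinite
off_diag = L - np.diag(np.diag(L))
assert np.all(off_diag <= 1e-12)                 # nonpositive off-diagonals

# L is singular, so shift it to get a strictly PD precision matrix Theta;
# Theta then defines a proper GMRF from which we can sample signals.
Theta = L + 0.5 * np.eye(4)
Sigma = np.linalg.inv(Theta)
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(np.zeros(4), Sigma, size=3)  # shape (3, 4)
```

In this model a zero off-diagonal entry of Theta encodes conditional independence of the two nodes, which is why topology constraints on the graph translate directly into sparsity-pattern constraints on the precision matrix.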
“…Third, although the lines of work reviewed in this survey focus mainly on signal representation, it is also possible to constrain the learned graphs directly by enforcing graph properties beyond the common choice of sparsity, which has been adopted explicitly in the optimization problems of many existing methods, e.g., [15], [25], [42], [45], [46], [55], [62]. One example is the work in [82], where the authors propose to infer graphs with monotone topology properties.…”
Section: B. Outcome of Learning Framework
confidence: 99%