2013
DOI: 10.1080/01621459.2012.754359
Learning Sparse Causal Gaussian Networks With Experimental Intervention: Regularization and Coordinate Descent

Abstract: Causal networks are graphically represented by directed acyclic graphs (DAGs). Learning causal networks from data is a challenging problem due to the size of the space of DAGs, the acyclicity constraint placed on the graphical structures, and the presence of equivalence classes. In this article, we develop an L 1 -penalized likelihood approach to estimate the structure of causal Gaussian networks. A blockwise coordinate descent algorithm, which takes advantage of the acyclicity constraint, is proposed for seek…

Cited by 70 publications (130 citation statements)
References 32 publications
“…Zou (2006) showed that the adaptive lasso achieves consistency in model selection if Ã_ij is a √n-consistent estimate of A_ij, and suggested using the ordinary least squares (OLS) estimate for Ã_ij. Fu and Zhou (2013) proposed using the OLS estimate with a truncated weight, min(1/|Ã_ij|^γ, 1/(N^-1)^γ) = min(1/|Ã_ij|^γ, N^γ), with N = 10^4. However, if there are correlations among the variables, the OLS estimates are unstable.…”
Section: Problem Formulation
confidence: 99%
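The truncated adaptive-lasso weight described in this statement can be sketched as follows. This is a minimal illustration under one reading of the bound (weights capped at N^γ so that near-zero initial estimates do not produce infinite penalties); the function and parameter names are illustrative, not from the paper.

```python
import numpy as np

def adaptive_weights(A_tilde, gamma=1.0, N=1e4):
    """Truncated adaptive-lasso weights (illustrative sketch):
    w_ij = min(1 / |A~_ij|^gamma, N^gamma), so the weight is capped at
    N^gamma even when the initial estimate A~_ij is close to zero."""
    A_tilde = np.asarray(A_tilde, dtype=float)
    raw = 1.0 / np.abs(A_tilde) ** gamma   # blows up as A~_ij -> 0
    return np.minimum(raw, N ** gamma)     # truncate at N^gamma

# A near-zero initial estimate receives the capped weight N^gamma = 1e4:
w = adaptive_weights(np.array([1e-8, 0.5, 2.0]), gamma=1.0, N=1e4)
```

With γ = 1 and N = 10^4, the entry 1e-8 would nominally get weight 1e8 but is truncated to 1e4, while the other entries keep their untruncated weights 2.0 and 0.5.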
“…Formula (8) provides a lower bound of 1 and an upper bound of N^γ, respectively. In our simulation study, we use N = 10^4, as Fu and Zhou (2013) did. We construct the initial estimates Ã_ij from the regular lasso estimates obtained by minimizing function (8) with a certain λ_0, γ, and w_ij = 1.…”
Section: Problem Formulation
confidence: 99%
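The regular lasso used here for the initial estimates (unit weights w_ij = 1) can be sketched with plain cyclic coordinate descent. This is a generic numpy sketch, not the paper's blockwise algorithm; `lasso_cd` and its arguments are illustrative names, and columns of X are assumed standardized.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Unit-weight lasso for (1/2)||y - Xb||^2 + lam*||b||_1 by cyclic
    coordinate descent -- a minimal sketch of how initial estimates
    might be obtained before forming adaptive weights."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft(X[:, j] @ r, lam) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([3.0, 0.0, -2.0]) + 0.01 * rng.normal(size=100)
b = lasso_cd(X, y, lam=1.0)   # nonzero entries near 3 and -2
```

The resulting Ã_ij would then be plugged into a weight formula such as the truncated one above.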
“…However, it tends to yield a large number of false positives in sparse network problems, as pointed out by Fu and Zhou in their seminal paper (Fu & Zhou, 2013). Fu and Zhou proposed an "elbow method" that outperforms the cross-validation method, where the optimal tuning parameter corresponds to the change point beyond which an increase of λ does not yield a substantial decrease of the log-likelihood.…”
Section: Cox Proportional Hazard Model With Sparse Group Lasso Penalty
confidence: 99%
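One way to operationalize the elbow idea described in this statement is to locate the point of sharpest bend in the log-likelihood path over λ. The paper's exact change-point rule is not reproduced here; picking the largest change in successive differences is an assumption, and the function name is illustrative.

```python
import numpy as np

def elbow_lambda(lambdas, logliks):
    """Sketch of an 'elbow' pick on a tuning path: given log-likelihoods
    evaluated over an increasing grid of lambdas, return the lambda at
    the sharpest bend, i.e. the largest change in successive
    per-step decreases of the log-likelihood."""
    lambdas = np.asarray(lambdas, dtype=float)
    logliks = np.asarray(logliks, dtype=float)
    drops = np.diff(logliks)                    # per-step change in log-likelihood
    elbow = np.argmax(np.abs(np.diff(drops))) + 1
    return lambdas[elbow]

# A path that is flat for small lambda, then drops steeply:
lam_star = elbow_lambda([0.1, 0.2, 0.3, 0.4, 0.5],
                        [-1.0, -1.1, -1.2, -3.0, -5.0])
```

On this toy path the pick is λ = 0.3, the last value before further increases in λ cause steep log-likelihood losses.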