2020
DOI: 10.1109/jsait.2020.3040552
Exact Asymptotics for Learning Tree-Structured Graphical Models With Side Information: Noiseless and Noisy Samples

Cited by 7 publications (15 citation statements). References 19 publications.
“…The correlations between adjacent nodes ρ are set to 0.9, 0.7, and 0.5. Theorem 3 in [TTZ20] provides an exact asymptotic expression for the best possible error probability, and it serves as a baseline of the simulation; this is indicated as "Passive SCL: Theory" in Figs. 3, 4, and 5.…”
Section: Simulation Results
confidence: 99%
“…3, 4, and 5. Note that no other passive algorithm can perform better than that given by Theorem 3 in [TTZ20] (since this is based on the maximum likelihood or minimum error probability principle), so it also serves as a bona fide impossibility result for tree structure learning using passive strategies. deficiencies of the passive learning algorithm, especially when ρ is close to 1.…”
Section: Simulation Results
confidence: 99%
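The excerpts above concern passive learning of a tree structure under the maximum-likelihood principle, with adjacent-node correlations ρ set to values such as 0.9. A minimal illustrative sketch of that setting (not the cited paper's algorithm; all parameter choices here are assumptions) is the Chow-Liu procedure: sample a ±1-valued Markov chain with edge correlation ρ, then recover its structure as a maximum-weight spanning tree over empirical pairwise correlations.

```python
import random

def sample_chain(n, rho, num_nodes=3, seed=0):
    # Draw n samples from a +/-1-valued Markov chain in which
    # adjacent nodes agree with probability (1 + rho) / 2,
    # i.e. have correlation rho.
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        x = [1 if rng.random() < 0.5 else -1]
        for _ in range(num_nodes - 1):
            stay = rng.random() < (1 + rho) / 2
            x.append(x[-1] if stay else -x[-1])
        samples.append(x)
    return samples

def chow_liu_edges(samples, num_nodes=3):
    # Estimate |correlation| for every pair of nodes, then take a
    # maximum-weight spanning tree (Kruskal with union-find) --
    # for tree-structured models this is the Chow-Liu estimate.
    n = len(samples)
    corr = {}
    for i in range(num_nodes):
        for j in range(i + 1, num_nodes):
            corr[(i, j)] = abs(sum(s[i] * s[j] for s in samples) / n)
    parent = list(range(num_nodes))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    edges = []
    for (i, j), _ in sorted(corr.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:          # keep the edge only if it joins two components
            parent[ri] = rj
            edges.append((i, j))
    return sorted(edges)

samples = sample_chain(n=2000, rho=0.9)
# With rho = 0.9 the true chain edges [(0, 1), (1, 2)] are typically
# recovered, since corr(0, 2) concentrates near 0.81 < 0.9.
print(chow_liu_edges(samples))
```

The error probability of this passive scheme decays exponentially in the sample size n, which is the quantity whose exact asymptotics (Theorem 3 of [TTZ20], per the excerpts) serve as the baseline.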