16th IEEE International Conference on Tools With Artificial Intelligence
DOI: 10.1109/ictai.2004.55

Efficient learning of hierarchical latent class models

Abstract: Hierarchical latent class (HLC)

Cited by 35 publications (43 citation statements)
References 2 publications

“…To answer this question, SHC was tested on one of the synthetic data sets mentioned in Section 5.1 [20]. SHC turned out to be 22 times faster than DHC, and it obtained the same model as DHC.…”
Section: Empirical Results with SHC and HSHC
confidence: 99%
“…Can HSHC find high quality models? To answer those questions, experiments were conducted on five synthetic data sets with 6, 9, 12, 15 and 18 manifest variables, respectively, and 10,000 records [20]. For the top-K scheme in HSHC, three values were used for K, namely 1, 2 and 3.…”
Section: Empirical Results with SHC and HSHC
confidence: 99%
“…The algorithms we consider are abbreviated in this section as follows: algorithm IND generates baseline results using the model in which all variables are independent of each other. Algorithm ZHANG is the compiled Java code provided by N. L. Zhang implementing the method described in Zhang and Kočka (2004). Algorithm LCM estimates a non-hierarchical Latent Class Model inferring a single latent variable.…”
Section: Methods
confidence: 99%
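
As background for the "LCM" baseline described in the excerpt above, the sketch below shows one way to fit a non-hierarchical latent class model with a single latent variable using plain EM. It is an illustrative reconstruction, not code from the paper or the citing study; the function name fit_lcm, the use of binary manifest variables, and all numerical settings are assumptions.

import numpy as np

def fit_lcm(X, n_classes=2, n_iter=100, seed=0):
    """EM for a latent class model: P(x) = sum_k P(z=k) * prod_d P(x_d | z=k)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)               # class priors P(z=k)
    theta = rng.uniform(0.25, 0.75, size=(n_classes, d))   # P(x_d = 1 | z=k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = P(z=k | x_i) under current parameters
        log_r = np.log(pi) + X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate priors and conditionals from expected counts
        nk = r.sum(axis=0)
        pi = nk / n
        theta = (r.T @ X + 1e-6) / (nk[:, None] + 2e-6)    # tiny smoothing keeps logs finite
    return pi, theta

# Toy usage: two latent classes generating six binary manifest variables.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    z = rng.integers(0, 2, size=1000)
    true_theta = np.array([[0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
                           [0.1, 0.1, 0.1, 0.9, 0.9, 0.9]])
    X = (rng.random((1000, 6)) < true_theta[z]).astype(float)
    pi, theta = fit_lcm(X, n_classes=2)
    print(np.round(pi, 2), np.round(theta, 2))

The single latent variable z plays the role of the one hidden node in the LCM baseline; the hierarchical latent class models studied in the paper generalize this to a tree of latent variables with the manifest variables at the leaves.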
“…It can deal with data sets with about one dozen manifest variables. Zhang and Kočka [8] recently proposed another algorithm called heuristic single hill-climbing (HSHC). HSHC combines the two search routines of DHC into one and incorporates the idea of structural EM [2] to reduce the time spent in parameter optimization.…”
Section: Hierarchical Latent Class Models
confidence: 99%
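
To make the structural-EM shortcut mentioned in this excerpt more concrete, here is a small self-contained sketch. It is not the authors' DHC or HSHC code, and the structure space (which binary manifest variables are attached to a single latent class variable) is a deliberately simplified stand-in for the HLC search operators. A single hill-climbing loop scores every candidate move with an expected complete-data BIC computed from the current model's responsibilities, and the expensive full EM run happens only for the selected candidate. All function names and numerical settings are assumptions.

import numpy as np

def _class_loglik(Xc, pi, theta):
    """Per-row, per-class complete-data log-likelihood: log pi_k + log P(x_C | z=k)."""
    return np.log(pi) + Xc @ np.log(theta).T + (1 - Xc) @ np.log(1 - theta).T

def run_em(X, children, K=2, iters=60, seed=0):
    """Full EM for: latent z (K classes) -> X[:, children]; other columns independent."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    pi = np.full(K, 1.0 / K)
    theta = rng.uniform(0.3, 0.7, (K, len(children)))
    Xc = X[:, children]
    for _ in range(iters):
        a = _class_loglik(Xc, pi, theta)
        a -= a.max(axis=1, keepdims=True)
        r = np.exp(a)
        r /= r.sum(axis=1, keepdims=True)                  # E-step
        nk = r.sum(axis=0)
        pi = nk / n                                        # M-step
        theta = (r.T @ Xc + 1e-6) / (nk[:, None] + 2e-6)
    return pi, theta, r

def bic(X, children, pi, theta):
    """Exact BIC of a fitted model on the data."""
    n, d = X.shape
    marg = X.mean(axis=0).clip(1e-6, 1 - 1e-6)
    detached = [j for j in range(d) if j not in children]
    a = _class_loglik(X[:, children], pi, theta)
    m = a.max(axis=1, keepdims=True)
    ll = (m.ravel() + np.log(np.exp(a - m).sum(axis=1))).sum()
    ll += (X[:, detached] @ np.log(marg[detached])
           + (1 - X[:, detached]) @ np.log(1 - marg[detached])).sum()
    k_params = (len(pi) - 1) + theta.size + len(detached)
    return ll - 0.5 * k_params * np.log(n)

def expected_bic(X, children, r):
    """Structural-EM surrogate: expected complete-data BIC under responsibilities r
    taken from the *current* model, so no inner EM run is needed per candidate."""
    n, d = X.shape
    nk = r.sum(axis=0)
    pi = nk / n
    Xc = X[:, children]
    theta = (r.T @ Xc + 1e-6) / (nk[:, None] + 2e-6)
    marg = X.mean(axis=0).clip(1e-6, 1 - 1e-6)
    detached = [j for j in range(d) if j not in children]
    ell = (r * _class_loglik(Xc, pi, theta)).sum()
    ell += (X[:, detached] @ np.log(marg[detached])
            + (1 - X[:, detached]) @ np.log(1 - marg[detached])).sum()
    k_params = (len(pi) - 1) + theta.size + len(detached)
    return ell - 0.5 * k_params * np.log(n)

def hill_climb(X, K=2):
    d = X.shape[1]
    children = [0, 1]                                      # arbitrary starting structure
    pi, theta, r = run_em(X, children, K)
    best = bic(X, children, pi, theta)
    while True:
        # One combined pool of moves: attach or detach a single manifest variable.
        moves = [sorted(set(children) ^ {j}) for j in range(d)]
        moves = [m for m in moves if m]                    # the latent node keeps >= 1 child
        cand = max(moves, key=lambda m: expected_bic(X, m, r))
        pi_c, theta_c, r_c = run_em(X, cand, K)            # full EM only for the winner
        score = bic(X, cand, pi_c, theta_c)
        if score <= best + 1e-6:                           # no real improvement: stop
            return children, best
        children, best, r = cand, score, r_c

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    z = rng.integers(0, 2, size=2000)
    p = np.array([[0.9, 0.9, 0.9, 0.5, 0.5, 0.5],
                  [0.1, 0.1, 0.1, 0.5, 0.5, 0.5]])         # last 3 columns independent of z
    X = (rng.random((2000, 6)) < p[z]).astype(float)
    print(hill_climb(X))

The saving mirrors the motivation quoted above: expected_bic reuses one set of responsibilities for every candidate move, so parameter optimization via run_em is performed once per search step rather than once per candidate structure.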