DOI: 10.32469/10355/56468

Partial membership latent Dirichlet allocation

Abstract: Topic models (e.g., pLSA, LDA, sLDA) have been widely used for segmenting imagery. However, these models are confined to crisp segmentation, forcing a visual word (i.e., an image patch) to belong to one and only one topic. Yet, there are many images in which some regions cannot be assigned a crisp categorical label (e.g., transition regions between a foggy sky and the ground or between sand and water at a beach). In these cases, a visual word is best represented with partial memberships across multiple topics.…
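To make the crisp-versus-partial distinction concrete, the toy sketch below contrasts the two representations for a single visual word; the two-topic appearance parameters and membership values are hypothetical illustrations, not anything from the paper.

# Toy contrast between a crisp (one-hot) assignment and a partial
# membership vector for one visual word. Values are illustrative only.
import numpy as np

topic_means = np.array([[0.9, 0.1],   # e.g., a "sky" topic's appearance
                        [0.1, 0.9]])  # e.g., a "ground" topic's appearance

crisp = np.array([1.0, 0.0])    # the word belongs entirely to topic 0
partial = np.array([0.6, 0.4])  # memberships on the simplex, e.g., a
                                # patch in a fog/ground transition region

# Under partial membership, the word's expected appearance is a convex
# combination of the topic parameters.
print("crisp appearance:  ", crisp @ topic_means)
print("partial appearance:", partial @ topic_means)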

Cited by 17 publications (18 citation statements)
References 19 publications
“…Latent Dirichlet Allocation (LDA) described in [4,5] is a statistical topic model that discovers the hidden themes or topics in a collection of documents. Assuming an imaginary generative process of constructing these documents, LDA then tries to backtrack from these documents to infer a set of topics that are likely to have generated the collection [6]. While a powerful tool for discovering the thematic structure of text, LDA makes certain assumptions that make it inappropriate when considering the semantics of a language.…”
Section: Related Work
confidence: 99%
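As a hedged illustration of the inference this statement describes, the sketch below fits standard LDA to a toy corpus with scikit-learn; the documents, topic count, and variable names are placeholders, not data or code from any cited work.

# Minimal LDA fit on a toy corpus: infer topic proportions per document
# and word distributions per topic from bag-of-words counts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the sky and clouds over the sea",
    "sand and water at the beach",
    "foggy sky above the wet sand",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)            # document-term count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)              # per-document topic proportions
topic_word = lda.components_                  # per-topic word weights

vocab = vectorizer.get_feature_names_out()
for k, weights in enumerate(topic_word):
    top = weights.argsort()[::-1][:5]         # five highest-weight words
    print(f"topic {k}:", [vocab[i] for i in top])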
“…The appropriate number of topics for our dataset was determined by comparing the intra-topic similarity with inter-topic dissimilarity. Analyses specifying five, ten, fifteen, and twenty topic solutions were run, and the optimum number of topics was determined to be twenty, as this maximized the difference between topics (Chen and Wang, 2018). Words in topics were generated by determining the mix between the probability and relevance of words belonging to topics (Sievert and Shirley, 2014).…”
Section: Citation Analysis
confidence: 99%
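The relevance measure cited above (Sievert and Shirley, 2014) ranks a word within a topic by mixing its topic probability with its lift over the corpus-wide frequency: relevance(w, k) = λ·log p(w|k) + (1−λ)·log(p(w|k)/p(w)). A small sketch with a hypothetical topic-word matrix:

# Term relevance in the sense of Sievert and Shirley (2014); the
# topic-word probabilities below are made-up illustrative values.
import numpy as np

topic_word = np.array([              # p(word | topic), rows sum to 1
    [0.50, 0.30, 0.15, 0.05],
    [0.10, 0.10, 0.40, 0.40],
])
word_freq = topic_word.mean(axis=0)  # marginal p(word), assuming equal topic weights
lam = 0.6                            # lambda = 1 ranks purely by p(word | topic)

relevance = lam * np.log(topic_word) + (1 - lam) * np.log(topic_word / word_freq)
for k, row in enumerate(relevance):
    print(f"topic {k} word ranking (best first):", row.argsort()[::-1])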
“…Partial Membership Latent Dirichlet Allocation (PM-LDA) [26,27] is an extension of Latent Dirichlet Allocation topic modeling [28] that allows words to have partial membership across multiple topics. The use of partial memberships allows for topic modeling on data sets in which crisp topic assignments (as made by LDA) are insufficient, since data points (or words) may straddle multiple topics simultaneously.…”
Section: Partial Membership Latent Dirichlet Allocation
confidence: 99%
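A schematic sketch of the generative contrast this statement draws: LDA assigns each word a single crisp topic indicator, while a partial-membership model draws a continuous membership vector on the simplex. The document proportions and the concentration scale s below are assumed for illustration, not PM-LDA's fitted values.

# Crisp indicator (LDA-style) versus continuous membership (PM-LDA-style)
# for one word; hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
doc_topic = np.array([0.7, 0.3])       # document-level topic proportions

z = rng.choice(2, p=doc_topic)         # LDA: one topic per word
crisp_membership = np.eye(2)[z]        # one-hot, fully in a single topic

s = 5.0                                # larger s keeps draws nearer doc_topic
partial_membership = rng.dirichlet(s * doc_topic)  # a point on the simplex

print("crisp:  ", crisp_membership)
print("partial:", np.round(partial_membership, 3))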