2020
DOI: 10.1145/3412371

Bi-Directional Recurrent Attentional Topic Model

Abstract: In a document, the topic distribution of a sentence depends both on the topics of its neighboring sentences and on its own content, and it is usually affected by the topics of those neighboring sentences with different weights. The neighboring sentences of a sentence include its preceding and subsequent sentences. Meanwhile, a document can naturally be treated as a sequence of sentences. Most existing works on Bayesian document modeling do not take these points into consideration. To fill this …
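The dependency structure described in the abstract can be conveyed with a minimal NumPy sketch: a sentence's topic distribution is formed by mixing its own content-based topic vector with an attention-weighted average of the topic vectors of its preceding and subsequent sentences. The function names, the dot-product attention score, and the 50/50 mixing weight below are illustrative assumptions, not the paper's actual inference procedure.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def sentence_topic_distribution(content_topics, neighbor_topics):
    """Mix a sentence's own content-based topic vector with an
    attention-weighted average of its neighbors' topic vectors.

    content_topics:  (K,) topic proportions from the sentence's own words
    neighbor_topics: (N, K) topic proportions of preceding and subsequent sentences
    """
    # Attention scores: relevance of each neighbor to the current sentence
    # (a simple dot-product similarity here; the paper's scoring may differ).
    scores = neighbor_topics @ content_topics
    weights = softmax(scores)

    # Weighted neighbor context plus the sentence's own content.
    context = weights @ neighbor_topics
    mixed = 0.5 * content_topics + 0.5 * context
    return mixed / mixed.sum()

# Toy example: 3 topics, two preceding sentences and one subsequent sentence.
own = np.array([0.6, 0.3, 0.1])
neighbors = np.array([[0.5, 0.4, 0.1],   # preceding
                      [0.2, 0.7, 0.1],   # preceding
                      [0.1, 0.2, 0.7]])  # subsequent
print(sentence_topic_distribution(own, neighbors))
```

In the full Bayesian model these quantities are latent variables inferred jointly; the sketch only illustrates how neighbor topics and a sentence's own content are weighted and combined.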

Cited by 19 publications (13 citation statements)
References 50 publications
“…Specifically, web services are described by a specific semantically tagged language, e.g., SAWSDL (Semantic Annotations for Web Services Description Language) and OWL-S (Web Ontology Language for Services). Approaches based on a topic model are another kind of semantics-aware service discovery approach, and many related approaches have been proposed in recent years [3,17,18]. LDA-SVM [22] is proposed to handle the issue of labeling a large number of services when a service classifier is trained.…”
Section: Related Work (mentioning)
Confidence: 99%
“…Different from the conventional methods for service discovery or clustering with topic modeling, the bi-SWTM assumes that the topic distribution of each sentence in service descriptions is not only determined by the concepts of the involved words but is also influenced by its preceding and subsequent sentences. This is the basic assumption in many text mining tasks, such as topic modeling [17] and neural language modeling [10]. In traditional topic modeling, the topics of a text follow a specific prior distribution, such as a Dirichlet [3] or Normal [2] distribution.…”
Section: Bi-directional Sentence-Word Topic Model for Service Discovery (mentioning)
Confidence: 99%
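The last sentence of the statement above contrasts the two standard priors over topic proportions. A minimal NumPy sketch of how each prior produces a point on the topic simplex follows; the hyperparameter choices (symmetric alpha = 0.5, and a standard Gaussian mapped through a softmax for the logistic-normal, i.e., "Normal", case) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4  # number of topics

# Dirichlet prior (as in LDA): draw topic proportions directly on the simplex.
alpha = np.full(K, 0.5)
theta_dirichlet = rng.dirichlet(alpha)

# Logistic-normal ("Normal") prior (as in correlated / neural topic models):
# draw a Gaussian vector, then map it to the simplex with a softmax.
eta = rng.multivariate_normal(np.zeros(K), np.eye(K))
theta_logistic_normal = np.exp(eta) / np.exp(eta).sum()

print(theta_dirichlet, theta_dirichlet.sum())              # sums to 1
print(theta_logistic_normal, theta_logistic_normal.sum())  # sums to 1
```

Both draws are valid topic-proportion vectors; the logistic-normal form is the one that makes reparameterized, gradient-based inference convenient in neural topic models.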
“…Nowadays, the representatives include Neural Variational Document Model (NVDM) (Miao et al., 2016), Product of expert LDA (ProdLDA) (Srivastava and Sutton, 2017), and Embedded Topic Model (ETM) (Dieng et al., 2020), etc. Besides these "naive" neural variants of LDA, many other models have been investigated by applying (1) various neural modules to the topic encoder, e.g., recurrent module (Rezaee and Ferraro, 2020), attention mechanism (Li et al., 2020b), and graphical connection (Zhu et al., 2018; Yang et al., 2020), and (2) new learning paradigms, e.g., adversarial training (Wang et al., 2019), reinforcement learning (Gui et al., 2019), and lifelong learning (Gupta et al., 2020). However, despite their effectiveness on normal long texts, those models suffer from the sparsity problem of short texts (Zeng et al., 2018).…”
Section: Neural Topic Modeling (mentioning)
Confidence: 99%
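The VAE-style recipe shared by the neural topic models named in this statement can be summarized with a minimal PyTorch sketch: encode a bag-of-words vector into a Gaussian latent, softmax it into topic proportions, and decode back to a word distribution while penalizing the KL term. The class name, layer sizes, and toy data below are assumptions for illustration and do not reproduce any specific cited architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniNeuralTopicModel(nn.Module):
    """A bare-bones VAE-style topic model in the spirit of NVDM/ProdLDA."""

    def __init__(self, vocab_size, num_topics, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        # Topic-word weights (the "beta" matrix).
        self.decoder = nn.Linear(num_topics, vocab_size, bias=False)

    def forward(self, bow):
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample the latent with a differentiable draw.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        theta = F.softmax(z, dim=-1)                         # topic proportions
        recon = F.log_softmax(self.decoder(theta), dim=-1)   # word log-probs
        # Negative ELBO = reconstruction term + KL(q(z|x) || N(0, I)).
        rec_loss = -(bow * recon).sum(-1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (rec_loss + kl).mean()

# Toy usage: random bag-of-words counts over a 500-word vocabulary.
model = MiniNeuralTopicModel(vocab_size=500, num_topics=10)
bow = torch.randint(0, 3, (8, 500)).float()
loss = model(bow)
loss.backward()
```

The individual cited models differ mainly in how they parameterize the prior and the decoder (for example, ProdLDA's product-of-experts decoder or ETM's word-embedding decoder) rather than in this overall encode-sample-decode structure.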