Proceedings of the 2nd Workshop on Data Mining Using Matrices and Tensors 2009
DOI: 10.1145/1581114.1581117
Sequential latent semantic indexing

Cited by 3 publications (4 citation statements); references 5 publications.
“…Statistical approaches use only the given textual context without any additional meta-information. Various models have been already investigated in this direction: topic modeling (Prabhakaran et al, 2016;Uban et al, 2021;Krivenko and Vasilyev, 2009), clustering approaches (Mei and Zhai, 2005;Behpour et al, 2021), and so forth. Among the aforementioned models a sequential variant of LSI (Krivenko and Vasilyev, 2009) is the approach most similar to ours in terms of problem formulation.…”
Section: Related Work (mentioning)
confidence: 99%
“…Various models have been already investigated in this direction: topic modeling (Prabhakaran et al, 2016;Uban et al, 2021;Krivenko and Vasilyev, 2009), clustering approaches (Mei and Zhai, 2005;Behpour et al, 2021), and so forth. Among the aforementioned models a sequential variant of LSI (Krivenko and Vasilyev, 2009) is the approach most similar to ours in terms of problem formulation. Apart from that, other models utilize information from knowledge bases like the web (Roy et al, 2002) or citation graphs (Erten et al, 2004;Chang and Blei, 2010).…”
Section: Related Work (mentioning)
confidence: 99%
“…First, all terms within the text documents are extracted and a term-by-document matrix is built. Once the term-by-document matrix is made, SVD is employed to decompose it and construct a semantic vector space that can be used to represent conceptual term-document associations [25]. Principal Component Analysis: PCA is a successful and important technique, as it can reduce the size of the data by transforming the original attribute space into a smaller one.…”
Section: Latent Semantic Indexing Technique (mentioning)
confidence: 99%
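The excerpt above describes decomposing a term-by-document matrix with SVD and truncating it to a low-rank semantic space. A minimal sketch with numpy follows; the matrix and term/document labels are a hypothetical toy corpus, not data from the cited paper:

```python
import numpy as np

# Toy term-by-document matrix (rows = terms, columns = documents).
# Values are raw term counts in a hypothetical 4-term, 3-document corpus.
A = np.array([
    [2.0, 0.0, 1.0],   # term "latent"
    [1.0, 1.0, 0.0],   # term "semantic"
    [0.0, 2.0, 0.0],   # term "graph"
    [1.0, 0.0, 2.0],   # term "index"
])

# SVD decomposes A into U * diag(s) * Vt, with singular values
# in s sorted in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top-k singular values: a rank-k semantic space.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# A_k is the best rank-k approximation of the original matrix;
# term-document associations are now expressed through k concepts.
print(np.round(A_k, 2))
```

For large sparse corpora one would typically use a truncated sparse SVD (e.g. `scipy.sparse.linalg.svds` or scikit-learn's `TruncatedSVD`) rather than the dense decomposition shown here.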
“…Latent Semantic Analysis uses the singular value decomposition (SVD) technique, in which the large term-document matrix is decomposed into a set of k orthogonal factors so that the original textual data is mapped into a reduced semantic space. New document vectors and new query coordinates are then calculated and compared in the smaller k-dimensional space [4].…”
Section: Introduction (mentioning)
confidence: 99%
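The excerpt above mentions computing new query coordinates and comparing them with document vectors in the reduced k-dimensional space. A minimal sketch of this "folding-in" step, again using a hypothetical toy term-by-document matrix:

```python
import numpy as np

# Hypothetical toy term-by-document matrix (rows = terms, cols = documents).
A = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.0, 2.0, 0.0],
    [1.0, 0.0, 2.0],
])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

# Document coordinates in the k-dimensional semantic space:
# columns of diag(s_k) @ Vt_k, one row per document after transposing.
docs_k = (np.diag(sk) @ Vtk).T          # shape (n_docs, k)

# Fold a new query vector into the same space: q_k = q @ U_k @ S_k^{-1}.
q = np.array([1.0, 0.0, 0.0, 1.0])      # query hitting terms 0 and 3
q_k = q @ Uk @ np.diag(1.0 / sk)        # shape (k,)

# Rank documents by cosine similarity to the folded-in query.
cos = docs_k @ q_k / (np.linalg.norm(docs_k, axis=1) * np.linalg.norm(q_k))
ranking = np.argsort(-cos)              # document indices, best match first
print(ranking)
```

The same folding formula can embed a new document (a column of term counts) into the existing space without recomputing the SVD, which is what makes the comparison in the smaller k-dimensional space cheap.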