2023
DOI: 10.1016/j.eswa.2022.118695

An integrated latent Dirichlet allocation and Word2vec method for generating the topic evolution of mental models from global to local


Cited by 28 publications (10 citation statements)
References 32 publications
“…In this study, we perform semantic relevance visualization analysis of textual information using two methods, Word2Vec and TF-IDF. Word2Vec learns vector representations of words in a high-dimensional vector space and calculates the cosine distances between words to show their relevance [26]. First, the text words are converted to 100-dimensional word vectors using Word2Vec; then the high-dimensional vectors are reduced to 2 dimensions by PCA and finally plotted as a 2D image.…”
Section: Methods
confidence: 99%
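A minimal sketch of the pipeline this excerpt describes (Word2Vec training, 100-dimensional vectors, cosine relevance, PCA reduction to 2-D, scatter plot), assuming gensim and scikit-learn; the toy corpus below is an illustrative placeholder, not the cited study's data.

```python
# Sketch: train Word2Vec, project 100-d word vectors to 2-D with PCA, and plot.
from gensim.models import Word2Vec
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

# Placeholder corpus; the cited study uses its own text collection.
sentences = [
    ["topic", "evolution", "mental", "model"],
    ["word", "vector", "semantic", "relevance"],
    ["topic", "model", "semantic", "vector"],
]

# 100-dimensional embeddings, matching the dimensionality in the excerpt.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, seed=42)

# Cosine similarity between two words (Word2Vec's relevance measure).
print(model.wv.similarity("topic", "model"))

words = list(model.wv.index_to_key)
vectors = model.wv[words]                            # shape: (n_words, 100)
coords = PCA(n_components=2).fit_transform(vectors)  # reduce 100-d -> 2-d

plt.scatter(coords[:, 0], coords[:, 1])
for (x, y), w in zip(coords, words):
    plt.annotate(w, (x, y))
plt.title("Word2Vec embeddings projected to 2-D with PCA")
plt.show()
```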
“…Firstly, the Word2Vec model is used to construct the word vector space of the paper and patent data. As a word-vector tool, Word2Vec not only maps words from a high-dimensional space to a low-dimensional space but also preserves the relative positions of word vectors, which mitigates vector sparseness and captures semantic connections (Ma et al., 2023). Secondly, the cosine similarity between the vectors of the high-value paper topics and the patent topics is calculated to obtain the topic similarity matrix.…”
Section: Selecting Patent Topics
confidence: 99%
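The topic similarity matrix in this excerpt is a pairwise cosine-similarity computation; a minimal sketch assuming scikit-learn, with random vectors standing in for topic embeddings. In practice each topic vector might be, say, the mean Word2Vec embedding of its top terms; that aggregation choice is an assumption on my part, not stated in the excerpt.

```python
# Sketch: cosine-similarity matrix between paper-topic and patent-topic vectors.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
paper_topics = rng.normal(size=(5, 100))    # 5 paper topics, 100-d vectors
patent_topics = rng.normal(size=(7, 100))   # 7 patent topics, 100-d vectors

# Rows index paper topics, columns index patent topics.
similarity_matrix = cosine_similarity(paper_topics, patent_topics)

# For each paper topic, the most similar patent topic.
best_match = similarity_matrix.argmax(axis=1)
print(similarity_matrix.shape, best_match)
```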
“…Thus, Kim et al. (2020) used Word2Vec in combination with spherical k-means clustering to reveal crypto-industry trends. Word2Vec was also used by Gao et al. (2022) and Ma et al. (2023). Amado et al. (2018) created dictionaries of words by category in a semi-automated manner and clustered them via a document-term matrix and the latent Dirichlet allocation (LDA) algorithm.…”
Section: Literature Review
confidence: 99%
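Spherical k-means, mentioned in the excerpt, clusters by cosine rather than Euclidean distance. A common approximation, used in the sketch below as my own substitution rather than Kim et al.'s exact method, is to L2-normalize the embeddings and run ordinary k-means on the unit sphere.

```python
# Sketch: spherical k-means approximated by L2-normalizing vectors so that
# Euclidean k-means behaves like cosine clustering. Random vectors stand in
# for trained Word2Vec embeddings.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

rng = np.random.default_rng(1)
embeddings = rng.normal(size=(200, 100))  # 200 "words", 100-d vectors

unit_vectors = normalize(embeddings)      # project onto the unit sphere
labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(unit_vectors)
print(np.bincount(labels))                # cluster sizes
```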
“…Mohammadi and Karami (2022), Porter (2018), Sharma and Sharma (2022), Kukreja (2022) and Ma et al. (2023) also used the LDA approach to cluster terms; Te Liew et al. (2014) applied n-gram extraction through the NLTK library and examined the list of top sustainability-related terms based on TF-IDF statistics (TF: term frequency; IDF: inverse document frequency).…”
Section: Literature Review
confidence: 99%
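For reference, a minimal sketch of the two techniques this excerpt names, LDA over a document-term matrix and TF-IDF term ranking. The excerpt mentions NLTK; scikit-learn is substituted here to keep the example self-contained, and the documents are placeholders.

```python
# Sketch: LDA topics from a document-term matrix, plus top terms by TF-IDF.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [  # placeholder documents
    "sustainable energy policy and renewable power",
    "deep learning for text mining and topic models",
    "renewable energy storage and grid sustainability",
    "topic evolution analysis with word embeddings",
]

# Document-term matrix -> LDA topics.
counts = CountVectorizer()
dtm = counts.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)
terms = counts.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {top}")

# Top terms ranked by mean TF-IDF across the corpus.
tfidf = TfidfVectorizer()
scores = tfidf.fit_transform(docs).mean(axis=0).A1
print(sorted(zip(scores, tfidf.get_feature_names_out()), reverse=True)[:5])
```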