2020 International Conference on Smart Electronics and Communication (ICOSEC)
DOI: 10.1109/icosec49089.2020.9215319
Review on Word2Vec Word Embedding Neural Net

Cited by 40 publications (14 citation statements); references 18 publications.
“…Word2Vec is an NLP system that uses neural networks to create distributed word representations from a corpus (Sivakumar et al., 2020). The Word2Vec embeddings module of the Python library Gensim ( ) was used to train word vectors on the pre-processed text.…”
Section: Methods
confidence: 99%
“…To construct word vectors, we use the word2vec method in our research. To reduce computational complexity, word2vec uses two language models, the continuous bag-of-words (CBOW) and skip-gram models, to learn distributed word representations (Sivakumar et al., 2020). The CBOW model predicts the probability of the center word occurring given its context words.…”
Section: Methods
confidence: 99%
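The CBOW prediction step described above can be illustrated in plain NumPy: average the context words' input vectors, then score every vocabulary word for the center position. The vocabulary, dimensions, and random weights below are purely hypothetical.

```python
# Illustrative CBOW forward pass: predict the center word from
# the average of its context word vectors (toy, untrained weights).
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
dim = 4
W_in = rng.normal(size=(len(vocab), dim))   # input (context) embeddings
W_out = rng.normal(size=(len(vocab), dim))  # output (center) embeddings

# Context "the ... sat" predicting the missing center word.
context_ids = [vocab.index("the"), vocab.index("sat")]
h = W_in[context_ids].mean(axis=0)          # average the context vectors

scores = W_out @ h                          # one score per vocabulary word
probs = np.exp(scores) / np.exp(scores).sum()  # softmax over the vocabulary
print(vocab[int(probs.argmax())])           # most probable center word
```

During training, the cross-entropy loss between this distribution and the true center word is backpropagated into both weight matrices; skip-gram inverts the direction, predicting each context word from the center word.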
“…It is used in different applications, such as sentiment classification, named-entity recognition (NER), POS tagging, and document analysis. Word2Vec has two techniques: skip-gram (SG) and continuous bag of words (CBOW) [64]. In this study, CBOW was used after exhaustive experiments concluded that it outperformed SG.…”
Section: Preparing Data for Word Embedding/Text Representation
confidence: 99%