2019
DOI: 10.1109/access.2019.2911983

Part-of-Speech-Based Long Short-Term Memory Network for Learning Sentence Representations

Abstract: Sentence representations play an important role in the field of natural language processing. While word representation has been applied to many natural language processing tasks, sentence representation has not been applied as widely due to the more complex structure and richer syntactic information of sentences. To learn sentence representations with structural and syntactic information, we propose a new model called the part-of-speech-based Long Short-Term Memory network (pos-LSTM) model. The pos-LSTM model …

Cited by 12 publications (6 citation statements)
References 8 publications
“…They use the similarity between users to find the set of nearest neighbors, calculate user preferences as a weighted average over the neighbor set, and filter product searches for users according to these scores. Later, with the development of deep learning and its power in mining hidden features, more features that express user preferences can be discovered through deep mining, making the user preference model more accurate in describing user preferences [12][13][14].…”
Section: Related Work
confidence: 99%
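The neighbor-based scheme described in this citation statement can be sketched in a few lines. This is a minimal illustration, not the cited authors' implementation: the function name `predict_ratings`, the cosine similarity measure, and the toy rating matrix are all assumptions for the sketch.

```python
import numpy as np

def predict_ratings(R, k=2):
    """User-based collaborative filtering sketch (assumed setup):
    cosine similarity between users, prediction = similarity-weighted
    average of the k most similar users' ratings."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    S = (R @ R.T) / (norms * norms.T + 1e-12)  # user-user cosine similarity
    np.fill_diagonal(S, 0.0)                   # a user is not its own neighbor
    preds = np.zeros_like(R, dtype=float)
    for u in range(R.shape[0]):
        nbrs = np.argsort(S[u])[-k:]           # indices of the k nearest neighbors
        w = S[u, nbrs]
        preds[u] = (w @ R[nbrs]) / (w.sum() + 1e-12)
    return preds

# Toy data: 4 users x 5 items, 0 = unrated.
R = np.array([[5, 4, 0, 1, 0],
              [4, 5, 1, 0, 0],
              [1, 0, 5, 4, 5],
              [0, 1, 4, 5, 4]], dtype=float)
print(predict_ratings(R).round(2))
```

Because the weights are non-negative and normalized, each prediction is a convex combination of neighbor ratings and stays within the observed rating range.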
“…Task-Driven FC Teaching Model Analysis. Unlike the traditional teaching mode, the task-driven FC teaching model takes the analysis and partition of teaching content as the starting point of teaching activities, divides the teaching content into several task stages according to their logical relationships, and sets several levels of increasing difficulty for each stage according to students' learning level and learning needs [22,23]. The task-driven FC teaching model is shown in Figure 3.…”
Section: FC
confidence: 99%
“…Zhu et al. [16] incorporated lexical information into the learning of sentence representations. They used a classical network structure such as LSTM [17] to model the sequence of words in a sentence and obtain a sequence of hidden states, where each hidden state corresponds to a word in the input sentence.…”
Section: Related Work: NLP Using Part-of-Speech Information
confidence: 99%
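The "one hidden state per word" behavior described in this citation statement can be illustrated with a bare-bones LSTM step. This is a NumPy sketch with randomly initialized (untrained) weights, not the cited pos-LSTM model; the function name `lstm_states` and the toy dimensions are assumptions for the example.

```python
import numpy as np

def lstm_states(embeddings, hidden_size, seed=0):
    """Run a single-layer LSTM over a sequence of word embeddings and
    return one hidden state per word (random, untrained weights)."""
    rng = np.random.default_rng(seed)
    input_size = embeddings.shape[1]
    # One stacked weight matrix for the four gates: input, forget, cell, output.
    W = rng.standard_normal((4 * hidden_size, input_size + hidden_size)) * 0.1
    b = np.zeros(4 * hidden_size)
    h = np.zeros(hidden_size)          # hidden state
    c = np.zeros(hidden_size)          # cell state
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    states = []
    for x in embeddings:               # one step per word in the sentence
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)     # update cell state
        h = o * np.tanh(c)             # update hidden state
        states.append(h.copy())
    return np.stack(states)

# A toy "sentence" of 5 words, each an 8-dimensional embedding.
sentence = np.random.default_rng(1).standard_normal((5, 8))
H = lstm_states(sentence, hidden_size=16)
print(H.shape)  # one 16-dim hidden state per word -> (5, 16)
```

Each row of `H` is the hidden state after reading one more word, which is exactly the per-word sequence of states the statement refers to.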