2023
DOI: 10.1007/s11063-023-11176-6
Improved Recurrent Neural Networks for Text Classification and Dynamic Sylvester Equation Solving

Cited by 7 publications (4 citation statements)
References 47 publications
“…The tests were based on average text length, the number of subject areas, and the use of related versus non-related subject areas in the test groups. On average, 28-34 tests per group were run for each of the five subject-area counts (4, 6, 10, 14, 20), and each test has a paired counterpart that uses non-related subject areas. In total, 12 test groups were created.…”
Section: Methods
confidence: 99%
“…Similar calculations are found in the security domain for cryptosystems, which also use adjacency metrics to prevent side-channel attacks [9]. Subsequently, this matrix is sent to a recurrent neural network, which returns a vector of values for each of the streams that determines the degree of contiguity between the subject area and the user's request [4]. As a result of the search in each subject area, the streams form a list of the most relevant data: a list of terms, each with a coefficient of adjacency to the user's request.…”
Section: Fig 6 Chatbot Client-server Architecture
confidence: 95%
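The statement above describes an adjacency matrix being passed to a recurrent neural network that returns one contiguity value per subject-area stream, which is then used to rank terms by their adjacency to the user's request. A minimal NumPy sketch of that idea follows; it is not the cited system's code, and the network size, the randomly initialised weights standing in for trained parameters, and the sigmoid scoring head are assumptions made only for illustration.

import numpy as np

rng = np.random.default_rng(0)

def rnn_relevance_scores(adjacency, hidden_size=16):
    """adjacency: (n_terms, n_areas) matrix of term-to-subject-area adjacency values.
    Returns a vector of n_areas relevance scores in (0, 1), one per stream."""
    n_terms, n_areas = adjacency.shape
    # Randomly initialised weights stand in for trained parameters (assumption).
    W_in = rng.normal(scale=0.1, size=(hidden_size, n_areas))
    W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
    W_out = rng.normal(scale=0.1, size=(n_areas, hidden_size))
    h = np.zeros(hidden_size)
    # Treat each term row of the adjacency matrix as one time step of an Elman recurrence.
    for x_t in adjacency:
        h = np.tanh(W_in @ x_t + W_h @ h)
    # Sigmoid head: one contiguity/relevance coefficient per subject-area stream.
    return 1.0 / (1.0 + np.exp(-(W_out @ h)))

adj = rng.random((5, 4))  # toy data: 5 query terms, 4 subject areas
print(rnn_relevance_scores(adj))

In such a pipeline, each stream's score could then be used to rank the subject areas and keep only the terms whose adjacency coefficient to the request is high enough.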
“…Similar models are used in [33,34,35]. The authors in [36] propose enhancing RNN models by modifying the activation functions used.…”
Section: Related Work
confidence: 99%
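Reference [36] is characterised here only as modifying the activation functions of RNN models. A minimal sketch of what such a modification means in practice, assuming a plain Elman-style cell and a hypothetical "leaky tanh" nonlinearity invented purely for illustration (it is not the function proposed in [36]):

import numpy as np

def rnn_step(x_t, h_prev, W_in, W_h, activation=np.tanh):
    """One Elman-style recurrent step with a pluggable activation function."""
    return activation(W_in @ x_t + W_h @ h_prev)

def leaky_tanh(z, alpha=0.1):
    """Hypothetical modified activation: tanh plus a small linear leak term."""
    return np.tanh(z) + alpha * z

rng = np.random.default_rng(1)
W_in, W_h = rng.normal(size=(8, 3)), rng.normal(size=(8, 8))
x, h = rng.normal(size=3), np.zeros(8)
h_standard = rnn_step(x, h, W_in, W_h)                         # conventional tanh cell
h_modified = rnn_step(x, h, W_in, W_h, activation=leaky_tanh)  # modified-activation cell

Because only the nonlinearity changes, the rest of the recurrence (and any training loop around it) stays the same, which is what makes activation-function modifications a lightweight way to alter an RNN's dynamics.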
“…Recently, some researchers have also proposed robust and noise-tolerant ZNN models with applications to dynamic complex matrix equation solving [21,22] and mobile manipulator path tracking [23,24]. Additionally, improved recurrent neural networks have been proposed for text classification and dynamic Sylvester equation solving [25].…”
Section: Introduction
confidence: 99%
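For context, the dynamic Sylvester problem referenced in [25] has the standard time-varying form below, and zeroing neural network (ZNN) models are conventionally derived by driving a matrix-valued error function to zero through a design formula. This is a sketch of the generic construction only, not the specific robust, noise-tolerant, or improved models of [21-25]:

\[
  A(t)\,X(t) + X(t)\,B(t) = C(t), \qquad t \ge 0,
\]
\[
  E(t) = A(t)\,X(t) + X(t)\,B(t) - C(t), \qquad
  \dot{E}(t) = -\gamma\,\Phi\bigl(E(t)\bigr), \quad \gamma > 0,
\]

where \Phi(\cdot) is an element-wise activation mapping and the gain \gamma controls how fast E(t), and hence the residual of the Sylvester equation, converges to zero.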