2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT)
DOI: 10.1109/icccnt45670.2019.8944559

Part-of-Speech Tagger for Biomedical Domain Using Deep Neural Network Architecture


Cited by 14 publications (4 citation statements)
References: 6 publications
“…The first hidden layer processes the input sequence forward, while the other hidden layer processes it backward; both are connected to the same output layer, giving every point in the sequence access to both past and future context. Hence, BLSTMs outperform standard LSTMs and RNNs, yielding a significantly faster and more accurate model [14,58].…”
Section: Bidirectional Long Short-Term Memory
Citation type: mentioning (confidence: 98%)
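
The architecture this statement describes — a forward and a backward hidden layer feeding the same output layer — can be sketched in a few lines. The following is a minimal illustration, assuming PyTorch; the vocabulary, tagset, and layer sizes are placeholders, not values from the paper.

```python
# Minimal sketch of the BLSTM idea described above (PyTorch assumed).
# Vocabulary, tagset, and layer sizes are illustrative, not from the paper.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size=5000, tagset_size=40,
                 embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs one hidden layer forward and one backward
        # over the sequence; their states are concatenated, so every
        # position sees both past and future context.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Both directions feed the same output layer (hence 2 * hidden_dim).
        self.out = nn.Linear(2 * hidden_dim, tagset_size)

    def forward(self, token_ids):
        emb = self.embed(token_ids)   # (batch, seq, embed_dim)
        states, _ = self.lstm(emb)    # (batch, seq, 2 * hidden_dim)
        return self.out(states)       # per-token tag scores

# Example: tag scores for a batch of one 6-token sentence.
tagger = BiLSTMTagger()
scores = tagger(torch.randint(0, 5000, (1, 6)))
print(scores.shape)  # torch.Size([1, 6, 40])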
“…A Long Short-Term Memory (LSTM) is a special kind of RNN architecture capable of learning long-term dependencies. An LSTM can also learn to bridge time intervals of more than 1000 steps [14,57,58].…”
Section: Long Short-Term Memory
Citation type: mentioning (confidence: 99%)
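
What lets an LSTM bridge such long intervals is its gated cell state, which the forget and input gates can preserve or overwrite at each step. A minimal sketch of stepping an LSTM cell over a long sequence, again assuming PyTorch with illustrative sizes:

```python
# Sketch of how the LSTM cell state carries information across long
# intervals (PyTorch assumed; sizes are illustrative).
import torch
import torch.nn as nn

cell = nn.LSTMCell(input_size=8, hidden_size=16)
h = torch.zeros(1, 16)  # hidden state
c = torch.zeros(1, 16)  # cell state: the "memory" the gates protect

# Step over a long sequence; the forget/input gates learn when to keep
# or overwrite c, which is what lets dependencies span 1000+ steps.
for step in torch.randn(1200, 1, 8):
    h, c = cell(step, (h, c))
print(h.shape, c.shape)  # torch.Size([1, 16]) torch.Size([1, 16])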
“…In terms of transformer models, transfer learning has improved the accuracy of many NLP applications [30,31]. The study by Gopalakrishnan et al. [32] investigated the performance of LSTM and Gated Recurrent Unit (GRU) models on a biomedical dataset, finding that the bidirectional versions of both models outperformed their unidirectional counterparts.…”
Section: Literature Review
Citation type: mentioning (confidence: 99%)
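
Because nn.LSTM and nn.GRU in PyTorch share a constructor signature, a comparison like the one this statement attributes to Gopalakrishnan et al. can be run by swapping the recurrent unit behind a common interface. The sketch below is a generic illustration of that setup, not the actual configuration used in [32]:

```python
# Generic sketch of comparing bidirectional LSTM and GRU encoders
# (PyTorch assumed; not the setup from Gopalakrishnan et al. [32]).
import torch
import torch.nn as nn

def make_encoder(kind="lstm", embed_dim=100, hidden_dim=128):
    # nn.LSTM and nn.GRU share a constructor signature, so the same
    # tagging pipeline can be evaluated with either recurrent unit.
    rnn_cls = nn.LSTM if kind == "lstm" else nn.GRU
    return rnn_cls(embed_dim, hidden_dim,
                   batch_first=True, bidirectional=True)

x = torch.randn(1, 6, 100)  # one 6-token sentence of embeddings
for kind in ("lstm", "gru"):
    states, _ = make_encoder(kind)(x)
    print(kind, states.shape)  # both: torch.Size([1, 6, 256])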