2020
DOI: 10.1016/j.jbi.2020.103609
Character level and word level embedding with bidirectional LSTM – Dynamic recurrent neural network for biomedical named entity recognition from literature

Cited by 43 publications (24 citation statements). References 30 publications.
“…The extracted feature vectors are fed into the Bi-LSTM in both forward and backward LSTMs to learn the OD tolerance of decomposed shaft data in sequence. The Bi-LSTM cells cross verify the vectors against each other to ensure that the correct vector values are fed into the networks [31]. The CRF layer jointly decodes the best chain of labeled OD tolerance zones for a given shaft part.…”
Section: CNN-BiLSTM-CRF for OD Tolerance (citation type: mentioning)
confidence: 99%
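The statement above describes the standard division of labor in a BiLSTM-CRF tagger: the Bi-LSTM produces per-token emission scores, and the CRF layer then "jointly decodes the best chain" of labels via Viterbi search over a learned transition matrix. A minimal sketch of that decoding step, with illustrative (not learned) scores and a hypothetical `viterbi_decode` helper:

```python
# Hedged sketch of CRF Viterbi decoding over Bi-LSTM emission scores.
# The emission and transition values below are toy numbers chosen for
# illustration, not parameters from the paper's trained model.

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring label sequence.

    emissions:   per-step score lists, shape [T][num_labels]
    transitions: transitions[i][j] = score of moving from label i to j
    """
    num_labels = len(emissions[0])
    # scores[j]: best score of any path ending in label j at the current step
    scores = list(emissions[0])
    backpointers = []

    for step in emissions[1:]:
        new_scores, pointers = [], []
        for j in range(num_labels):
            best_i = max(range(num_labels),
                         key=lambda i: scores[i] + transitions[i][j])
            new_scores.append(scores[best_i] + transitions[best_i][j] + step[j])
            pointers.append(best_i)
        scores = new_scores
        backpointers.append(pointers)

    # Trace back from the best final label to recover the full chain
    best_last = max(range(num_labels), key=lambda j: scores[j])
    path = [best_last]
    for pointers in reversed(backpointers):
        path.append(pointers[path[-1]])
    path.reverse()
    return path

# Toy example: 3 time steps, 2 labels (e.g. O and B-ENT)
emissions = [[2.0, 0.5], [0.2, 1.8], [1.5, 0.3]]
transitions = [[0.5, 0.1], [0.2, 0.4]]
print(viterbi_decode(emissions, transitions))  # -> [0, 1, 0]
```

Joint decoding is what distinguishes the CRF layer from per-token softmax: the transition scores let the model penalize label chains that are individually likely but invalid as a sequence (e.g. I-ENT following O in BIO tagging).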
“…Early work focused mainly on traditional rule-based methods [4][5]. In recent years, with the development and maturation of artificial intelligence technology, many AI techniques have been applied to named entity and relation recognition, including the maximum entropy Markov model (MEMM) [6][7], conditional random fields (CRF) [8][9][10], convolutional neural networks (CNN) [11][12][13], recurrent neural networks (RNN) [14][15][16] and their refinement, the long short-term memory (LSTM) model [17][18][19], as well as the LSTM-CRF model [20][21], which combines an LSTM with a conditional random field.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…We organized the differences into three main points. First, RNA sequences are a mixture of meaningful and meaningless subsequences, where the meaningful units are embedded within the background sequence, unlike words, which form an ordered grammatical structure [16]. An RNA typically has a large variety of functions enabled by its meaningful units, such as the ability to form higher-level structures and to recruit other components [17].…”
Section: Introduction (citation type: mentioning)
confidence: 99%