2018
DOI: 10.1007/978-3-319-76941-7_44

Co-training for Extraction of Adverse Drug Reaction Mentions from Tweets

Abstract: Adverse drug reactions (ADRs) are one of the leading causes of mortality in health care. Current ADR surveillance systems are often associated with a substantial time lag before such events are officially published. On the other hand, online social media such as Twitter contain information about ADR events in real time, well before any official reporting. Current state-of-the-art methods in ADR mention extraction use Recurrent Neural Networks (RNN), which typically need large labeled corpora. Towards this end,…
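The abstract names co-training as the way around the labeled-data bottleneck. Below is a minimal, illustrative sketch of the co-training loop for tweet-level ADR detection; it is not the paper's implementation (which co-trains RNN sequence labellers), and the two feature views, example tweets, threshold, and scikit-learn classifiers are all assumptions made for the sketch.

```python
# Minimal co-training sketch (illustration only). The paper co-trains RNN
# sequence labellers over tweets; this toy instead pairs two logistic-regression
# classifiers over different bag-of-features "views" for tweet-level ADR
# detection. Every tweet, threshold, and feature choice below is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled = [("this drug gave me a terrible headache", 1),
           ("started the new medication today", 0),
           ("nausea all day after those pills", 1),
           ("picked up my prescription this morning", 0)]
unlabeled = ["dizzy ever since i took that tablet",
             "doctor renewed my prescription",
             "a rash showed up after the second dose",
             "feeling fine on the new meds"]

def make_view(analyzer, ngram_range):
    # One "view" = one feature representation plus its own classifier.
    return make_pipeline(
        CountVectorizer(analyzer=analyzer, ngram_range=ngram_range),
        LogisticRegression(max_iter=1000),
    )

views = [make_view("word", (1, 1)), make_view("char_wb", (3, 4))]
train = [list(labeled), list(labeled)]   # separate training pool per view
pool = list(unlabeled)
CONFIDENCE = 0.7                         # assumed self-labelling threshold

for _ in range(3):                       # a few co-training rounds
    for i, view in enumerate(views):
        texts, ys = zip(*train[i])
        view.fit(list(texts), list(ys))
    if not pool:
        break
    remaining = []
    for text in pool:
        moved = False
        for i, view in enumerate(views):
            confidence = view.predict_proba([text])[0].max()
            if confidence >= CONFIDENCE:
                # A confident view labels the tweet for the OTHER view.
                train[1 - i].append((text, int(view.predict([text])[0])))
                moved = True
                break
        if not moved:
            remaining.append(text)
    pool = remaining

print(views[0].predict(["awful migraine after taking this medicine"]))
```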

Cited by 11 publications (8 citation statements)
References 14 publications
“…Multi-head self-attention with various features [38] has some advantages over CNN, CRNN and CNN with an attention mechanism on ADR tweet classification. Transfer learning [8], co-training [9] and multi-task learning [39] are adopted to extract ADRs, classify tweets mentioning ADRs and normalize ADR concepts, with multi-task learning achieving the state-of-the-art result. With BERT performing well in many NLP tasks, researchers introduce a knowledge base and a conditional random field (CRF) into BERT for the automatic classification of ADRs (text classification) and the extraction of ADRs (NER), respectively, on the SMM4H Shared Task 2019 [19].…”
Section: B. Automatic ADR Detection from Social Texts (citation type: mentioning; confidence: 99%)
“…Text mining and partially supervised learning methods [3] are integrated to classify ADR messages (positive instances) and non-ADR messages (negative instances), and researchers employ various features such as word embeddings [4], position features [5] and medical knowledge [6] to improve the overall performance of their methods. Moreover, researchers utilize attention mechanisms [7], transfer learning [8], co-training [9], broad learning [10] and multi-task learning [11] to learn these deep dominant features [12]. Medical resources and emotion scores are merged into features that represent the semantic meaning of the text segments in different methods.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…Huynh T. et al. applied a convolutional recurrent neural network (CRNN), obtained by concatenating a CNN with a recurrent neural network (RNN), and a CNN with additional weights (Huynh et al., 2016). Gupta S. et al. utilized a semi-supervised method based on co-training (Gupta et al., 2018a). Chowdhury et al. proposed a multi-task neural network framework that, in addition to ADR classification, learns to extract ADR mentions (Chowdhury et al., 2018).…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…Modern approaches for extracting ADRs are based on neural networks. Saldana adopted a CNN for the detection of ADR-relevant sentences (Miranda, 2018); Gupta et al. utilized a semi-supervised method based on co-training (Gupta et al., 2018a). Chowdhury et al. proposed a multi-task neural network framework that, in addition to ADR classification, learns to extract ADR mentions (Chowdhury et al., 2018).…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…Feature extraction from unstructured textual data in Natural Language Processing (NLP) has produced many approaches for text mining, such as the probabilistic Bag-of-Words (BOW) approach with Machine Learning (ML), distributed word representations (word embeddings) such as the Word2vec algorithm applied with Deep Learning (DL) [8], [9], and transfer learning models such as Bidirectional Encoder Representations from Transformers (BERT) [10] and XLNet [11]. The BERT model is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
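Since the last statement describes BERT's bidirectional pre-training as the basis for ADR classification and extraction, here is a minimal sketch of framing ADR mention extraction as BERT token classification. It assumes the Hugging Face transformers library; the model name, BIO label set, and example tweet are placeholders rather than details from the cited papers, and the classification head is untrained until fine-tuned on ADR-annotated tweets.

```python
# Minimal sketch: ADR mention extraction as BERT token classification (NER).
# Assumes the Hugging Face transformers library; labels and model name are
# placeholders, not taken from the paper or the citing works.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-ADR", "I-ADR"]          # assumed BIO tag set for ADR spans
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(labels)
)  # the token-classification head is randomly initialised until fine-tuned

tweet = "this medication gave me a pounding headache"
enc = tokenizer(tweet, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits          # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0].tolist()

tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
for token, pred in zip(tokens, pred_ids):
    print(f"{token:>12}  {labels[pred]}") # tags are arbitrary before fine-tuning
```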