2021
DOI: 10.1016/j.invent.2021.100422

Automatic identification of suicide notes with a transformer-based deep learning model

Abstract: Suicide is one of the leading causes of death worldwide. At the same time, the widespread use of social media has led to an increase in people posting their suicide notes online. Therefore, designing a learning model that can aid the detection of suicide notes online is of great importance. However, current methods cannot capture both local and global semantic features. In this paper, we propose a transformer-based model named TransformerRNN, which can effectively extract contextual and long-term dependency in…

Cited by 33 publications (12 citation statements) | References 35 publications
“… LSTM or GRU with multiple instance learning 145 , 146 Using multiple instance learning to get the possibility of post-level labels and improve the prediction of user-level labels. SISMO 139 An ordinal hierarchical LSTM attention model Transformer-based methods Self-attention models 148 , 149 Using the encoder structure of transformer which has self-attention module. BERT-based models (BERT 150 , 151 , DistilBERT 152 , RoBERTa 153 , ALBERT 150 , BioClinical BERT 31 , XLNET 154 , GPT-1 155 ) Different BERT-based pre-trained models.…”
Section: Results
confidence: 99%
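The multiple-instance-learning approach quoted above treats a user as a "bag" of posts: post-level scores are aggregated to improve the user-level prediction. A minimal sketch of that aggregation, assuming an upstream model (e.g. an LSTM or GRU) has already produced post-level risk probabilities — the function names and threshold here are hypothetical, not from the cited papers:

```python
# Illustrative sketch only: MIL-style aggregation of post-level
# probabilities into a user-level label. Assumes some upstream model
# (e.g. LSTM/GRU) already scored each post; names are hypothetical.

def user_level_score(post_probs):
    """Aggregate post-level probabilities into one user-level score.

    Under the standard MIL assumption, a bag (user) is positive if at
    least one instance (post) is positive, so max-pooling is a natural
    aggregator.
    """
    if not post_probs:
        raise ValueError("user has no posts")
    return max(post_probs)

def classify_user(post_probs, threshold=0.5):
    """Label a user as at-risk if any single post crosses the threshold."""
    return user_level_score(post_probs) >= threshold

# Example: three posts, only one strongly indicative.
print(classify_user([0.05, 0.12, 0.91]))  # True
print(classify_user([0.05, 0.12, 0.30]))  # False
```

In practice the aggregator is often learned (attention-weighted pooling rather than a hard max), but the bag-of-posts framing is the same.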
“…Wang et al proposed the C-Attention network 148 by using a transformer encoder block with multi-head self-attention and convolution processing. Zhang et al also presented their TransformerRNN with multi-head self-attention 149 . Additionally, many researchers leveraged transformer-based pre-trained language representation models, including BERT 150 , 151 , DistilBERT 152 , Roberta 153 , ALBERT 150 , BioClinical BERT for clinical notes 31 , XLNET 154 , and GPT model 155 .…”
Section: Results
confidence: 99%
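The common core of the transformer encoder blocks this statement refers to (in the C-Attention network, TransformerRNN, and the BERT family) is scaled dot-product self-attention. A toy, dependency-free sketch with 2-dimensional token vectors — real models add learned Q/K/V projections and multiple heads:

```python
import math

# Illustrative sketch only: scaled dot-product self-attention on toy
# vectors. Real transformer encoders apply learned linear projections
# to obtain Q, K, V and run several heads in parallel.

def softmax(xs):
    m = max(xs)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(q, k, v):
    """q, k, v: equal-length lists of token vectors (lists of floats).

    Each output vector is a weighted average of the value vectors, with
    weights softmax(q_i . k_j / sqrt(d)).
    """
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d)
                  for kj in k]
        weights = softmax(scores)     # weights over all tokens sum to 1
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out

# Toy example: 3 tokens, 2-dimensional embeddings; Q = K = V.
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = self_attention(x, x, x)
```

Because each output row is a convex combination of the value vectors, every component of `y` stays within the range of the corresponding value components.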
“…A cross-sectional assessment of structural Magnetic Resonance Imaging (MRI) data with a support vector machine learning model for a sample of adolescents/young adults diagnosed with major depression disorder found a high-level of accuracy for predicting those with suicide ideation and/or attempts ( Hong et al, 2021 ). A transformer-based deep learning model sought suicide risk identification from social media sources – it was proposed as effective for classifying suicide notes (i.e., contextual and long-term dependency information determined from different datasets) ( Zhang et al, 2021 ). Machine learning helps analyze big data easier by automating processes – this helps support a better predictive potential of an individual’s suicide risk but there is yet to be accurate prediction of specific risks across populations.…”
Section: Current State of the Art – Digital Tools and Technology
confidence: 99%
“…However, the corpus was later discontinued for usage due to privacy concerns. Other related corpora are either very small in size 42 , 43 or are not developed from genuine suicide notes 44 , 45 . The Northern Ireland Suicide Study 42 comprises a compilation of data from a range of documentation sources, including coroners’ files that contain suicide notes.…”
Section: Related Work
confidence: 99%