2017
DOI: 10.1016/j.procs.2017.10.036
It Takes Two To Tango: Modification of Siamese Long Short Term Memory Network with Attention Mechanism in Recognizing Argumentative Relations in Persuasive Essay

Cited by 10 publications (7 citation statements)
References 8 publications
“…In the continuous attempt to limit the number of handcrafted features and corpus-specific knowledge, research has mainly shifted towards Deep Learning. The most common architecture is bidirectional Long-Short Term Memory (LSTM) Neural Networks fed with word embeddings [23,24,7]. In general, Deep Learning frameworks tend to give state-of-the-art results, which approach human performance.…”
Section: Related Work on Argument Identification
confidence: 99%
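The excerpt above describes the common setup of feeding word embeddings through a bidirectional LSTM. As a minimal illustration of that idea only — not the architecture of any paper cited here, with all dimensions and weights made up — a forward-only sketch in NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Single-layer LSTM, forward pass only (illustrative sketch;
    sizes and random weights are assumptions, nothing is trained)."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the input, forget, cell, output gates.
        self.W = rng.normal(0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def run(self, xs):
        H = self.hidden_dim
        h, c, outs = np.zeros(H), np.zeros(H), []
        for x in xs:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
            g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
            c = f * c + i * g          # update cell state
            h = o * np.tanh(c)         # emit hidden state
            outs.append(h)
        return np.array(outs)

def bilstm_encode(embeddings, fwd, bwd):
    """Concatenate forward and backward hidden states per token."""
    hf = fwd.run(embeddings)
    hb = bwd.run(embeddings[::-1])[::-1]
    return np.concatenate([hf, hb], axis=1)

# Toy "word embeddings" for a 5-token sentence, dimension 8 (made-up data).
rng = np.random.default_rng(42)
sent = rng.normal(size=(5, 8))
enc = bilstm_encode(sent, TinyLSTM(8, 16), TinyLSTM(8, 16, seed=1))
print(enc.shape)  # (5, 32): one 2*hidden-dim vector per token
```

Each token ends up represented by both left-to-right and right-to-left context, which is what lets such encoders replace handcrafted features.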
“…Therefore, we aim to address those challenges of AM by using a novel Transfer Learning solution that is inter-domain applicable and does not require any labor-intensive NLP features. Current solutions in AM with unsupervised learning (e.g., [6]) or classification approaches using embedding structures and neural networks ( [7], [8]) fall short of solving those issues, since they are either not generalizable or very domain-specific. Hence, we aim to contribute to literature and practice by presenting a novel solution that works on a Deep Learning model architecture and enables future scientists and researchers to build AM pipelines without intensive effort.…”
Section: Introduction
confidence: 99%
“…It was done by observing either the relation was sufficient or not [11]. Long Short Term Memory (LSTM) as one of promising deep learning method for text was modified involving Siamese network to recognize argumentation relation in persuasive essay [40]. Furthermore, Hierarchical Attention Network (HAN) with XGBoost was utilized to similar task and indicated to be a promising method for hierarchical data [41].…”
Section: Argument Analysis
confidence: 99%
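The excerpt above refers to a Siamese network scoring the relation between two argument components. As a hedged sketch of the general Siamese pattern only — attention pooling plus a Manhattan-distance similarity are illustrative choices here, not a claim about the cited paper's exact model, and the hidden states are assumed to come from an LSTM encoder — consider:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Weight each time step by an attention vector, then sum (assumed form)."""
    scores = softmax(np.tanh(H @ w))
    return scores @ H

def siamese_similarity(a_states, b_states, w):
    """Pool both sequences with the SAME parameters (the 'Siamese' part)
    and score them with a Manhattan-distance similarity in (0, 1]."""
    a = attention_pool(a_states, w)
    b = attention_pool(b_states, w)
    return float(np.exp(-np.abs(a - b).sum()))

# Fake per-token hidden states for a claim and a premise (made-up data);
# in the real setting these would come from a shared LSTM encoder.
rng = np.random.default_rng(0)
dim = 16
w = rng.normal(size=dim)
claim = rng.normal(size=(6, dim))
premise = rng.normal(size=(4, dim))
s = siamese_similarity(claim, premise, w)
print(0.0 < s <= 1.0)  # similarity is bounded in (0, 1]
```

Sharing one set of weights across both inputs is what makes the comparison symmetric: two components are related insofar as the shared encoder maps them to nearby vectors.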
“…In the literature, some works on argument mining can be found [Desilia et al 2017;Gema et al 2017…”
Section: Related Work
unclassified