2020
DOI: 10.1186/s40537-020-00364-z

Argument annotation and analysis using deep learning with attention mechanism in Bahasa Indonesia

Abstract: Argumentation mining is a research field that focuses on argumentative sentences. Argumentative sentences are common in daily communication and play an important role in every decision- or conclusion-making process. The objective of this research is to investigate the use of deep learning combined with an attention mechanism for argument annotation and analysis. Argument annotation is the classification of argument components from a given discourse into several classes, including major claim, claim, and premise…


Cited by 8 publications (5 citation statements)
References 35 publications
“…Several argument models have been proposed in the literature, among which the traditional premise-claim model with support-attack relations stands out [4,31,32].…”
Section: Argument Model
confidence: 99%
“…Among the AM systems that use neural attention, the system in [29] integrates hierarchical attention and a biGRU to analyse argument quality, the one in [30] uses attention to integrate a sentiment lexicon, while in other works [31], [32], [33] attention modules are stacked on top of recurrent layers. The use of Pointer Networks for AM has also been investigated [34].…”
Section: B. Neural Attention for AM
confidence: 99%
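The citing works above describe the paper's mechanism as an attention module stacked on top of recurrent layers, pooling the hidden states into a single vector before classifying the argument component (major claim, claim, or premise). A minimal sketch of that pattern is below; it is not the paper's implementation — the additive-attention form, all weight matrices, and the dimensions are illustrative assumptions, and random values stand in for trained recurrent (e.g. biGRU) outputs.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, v):
    """Additive attention over recurrent hidden states H of shape (T, d).

    Each time step gets a scalar score tanh(h W) . v; softmax turns the
    scores into weights, and the context vector is the weighted sum of
    the hidden states. Returns (context (d,), weights (T,))."""
    scores = np.tanh(H @ W) @ v      # (T,) one score per time step
    weights = softmax(scores)        # (T,) non-negative, sums to 1
    context = weights @ H            # (d,) attention-pooled sentence vector
    return context, weights

rng = np.random.default_rng(0)
T, d, n_classes = 6, 8, 3            # 3 classes: major claim, claim, premise
H = rng.normal(size=(T, d))          # stand-in for biGRU outputs over 6 tokens
W = rng.normal(size=(d, d))          # attention projection (assumed, untrained)
v = rng.normal(size=(d,))            # attention scoring vector (assumed)

context, weights = attention_pool(H, W, v)
W_out = rng.normal(size=(d, n_classes))
logits = context @ W_out             # linear classifier over the pooled vector
pred = int(np.argmax(logits))        # index of the predicted argument class
```

The design point the citing papers highlight is that the attention weights make the pooling interpretable: unlike taking only the last recurrent state, one can inspect which tokens the classifier attended to when labelling a component.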