Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.65
Rethinking Self-Attention: Towards Interpretability in Neural Parsing

Abstract: Attention mechanisms have improved the performance of NLP tasks while allowing models to remain explainable. Self-attention is currently widely used; however, interpretability is difficult due to the numerous attention distributions. Recent work has shown that model representations can benefit from label-specific information, while facilitating interpretation of predictions. We introduce the Label Attention Layer: a new form of self-attention where attention heads represent labels. We test our novel layer by ru…
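The abstract describes the core idea concretely enough to sketch: in a Label Attention head, the query is a single learned vector per label rather than a projection of the input, so each head's attention distribution over the sentence can be read as that label's view of the words. Below is a minimal PyTorch sketch of that idea; the class name, dimensions, and the one-head-per-label simplification are illustrative assumptions, not the authors' exact implementation.

# Minimal sketch of a Label Attention Layer (assumed names/sizes).
# Unlike standard self-attention, each head owns one learned query
# vector (one per label) instead of computing queries from the input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelAttention(nn.Module):
    def __init__(self, d_model: int, n_labels: int, d_head: int):
        super().__init__()
        # One learned query vector per label (per head).
        self.queries = nn.Parameter(torch.randn(n_labels, d_head))
        self.key_proj = nn.Linear(d_model, d_head)
        self.value_proj = nn.Linear(d_model, d_head)
        self.scale = d_head ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        keys = self.key_proj(x)      # (B, T, d_head)
        values = self.value_proj(x)  # (B, T, d_head)
        # Each label's query attends over all positions.
        scores = torch.einsum("ld,btd->blt", self.queries, keys) * self.scale
        attn = F.softmax(scores, dim=-1)  # (B, n_labels, T): one distribution per label
        # One label-specific context vector per head.
        return torch.einsum("blt,btd->bld", attn, values)

x = torch.randn(2, 16, 256)  # toy batch: 2 sentences, 16 tokens
layer = LabelAttention(d_model=256, n_labels=8, d_head=64)
print(layer(x).shape)        # torch.Size([2, 8, 64])

Because each head's attn row is tied to a single label, inspecting it directly shows which words that label attends to; the paper additionally folds these head outputs back into per-word representations, which this sketch omits.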


Cited by 70 publications (49 citation statements) · References 35 publications
“…We use the same setting as in Section 4. The dataset in Mrini et al. (2020) is not identical to the dataset in previous work such as Zhang et al. (2020) and Wang and Tu (2020). ‡: For reference, we confirmed with the authors of He and Choi (2020) that they used a different data pre-processing script from previous work.…”
Section: Comparison With Embedding Weighting and Ensemble Approaches
confidence: 97%
“…We extract instances of clarifications from a resource of revision edits called wikiHowToImprove. Specifically, we used a state-of-the-art constituency parser (Mrini et al., 2020) to preprocess all revisions from wikiHowToImprove and applied a set of rule-based filters to identify specific types of edits (see Table 2).…”
Section: Data Collection
confidence: 99%
“…several times alredy i found my self driving in the middle of the crossing in red light luckily at the moment no fines. hehehe :) … the Penn Treebank (PTB) (Marcus et al., 1994), and the English-language parser of Mrini et al. (2020), which is the state of the art on the parse trees of the PTB.…”
Section: Setup
confidence: 99%