The 23rd International Conference on Information Integration and Web Intelligence 2021
DOI: 10.1145/3487664.3487788
Analysis of GraphSum’s Attention Weights to Improve the Explainability of Multi-Document Summarization

Cited by 3 publications (1 citation statement)
References 13 publications
“…Humphreys et al. proposed an architecture for predicting defects, using the sum of attention weights over all layers [76]. Attention weights were employed for searching in a transformer-based model dedicated to multi-document summarization in [77].…”
Section: Attention-based Methods
Confidence: 99%