2021
DOI: 10.1016/j.procs.2021.05.088
Attention based Abstractive Summarization of Malayalam Document

Cited by 7 publications (2 citation statements)
References 8 publications
“…Using their proposed method, they were able to generate a clear and concise summary. Nambiar et al. (2021) proposed abstractive summarization of Malayalam text, using a Seq2Seq model with an attention mechanism.…”
Section: Summarization of Indian Languages
confidence: 99%
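The Seq2Seq-with-attention approach described in the statement above can be sketched minimally. The following NumPy sketch is illustrative only, not the cited authors' implementation: the scoring function (plain dot product here), layer sizes, and toy inputs are all assumptions, since the report does not specify them.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(decoder_state, encoder_states):
    """Return a context vector as an attention-weighted sum of encoder states.

    decoder_state:  (hidden,)    current decoder hidden state
    encoder_states: (T, hidden)  one encoder hidden state per source token
    """
    scores = encoder_states @ decoder_state   # (T,) alignment scores
    weights = softmax(scores)                 # (T,) attention distribution over source tokens
    context = weights @ encoder_states        # (hidden,) weighted sum of encoder states
    return context, weights

# Toy example: 4 source tokens, hidden size 3 (hypothetical dimensions)
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))   # encoder hidden states
dec = rng.normal(size=(3,))     # current decoder hidden state
context, weights = dot_product_attention(dec, enc)
```

At each decoding step, the context vector is typically combined with the decoder state before predicting the next summary token, letting the decoder focus on different source tokens at each step.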
“…The study [9] highlights peculiar complexities of the Malayalam language, such as the frequent use of linking verbs (copulas) and the lack of subject-verb agreement in person, gender, and number. The classic S2S attention model is trained on data created by translating the freely downloadable BBC news corpus into Malayalam.…”
Section: Literature Survey
confidence: 99%