2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC)
DOI: 10.1109/itoec49072.2020.9141919
Generating summary using sequence to sequence model

Cited by 6 publications (3 citation statements)
References 5 publications
“…While the extractive method weights and selects sentences from the input texts based on ranking criteria. The latter method is popular (i.e., especially in news domain [23]) and seems a more straightforward method, which tends to produce a higher efficiency summary than the abstractive-based summary [24]. Diving into the extractive method that we consider in this paper, there are various techniques used in addressing Arabic summarization, such as statistical/linguistic-based [8], [15], [9], [6], semantic/query-based [12], [7], [8], [21], [9], graph/optimization-based [10], [11], [25], [12], and machine learning [5], [13], [14], [20], [19], [15], [26], [27], [28].…”
Section: A. Overview of ATS Approaches (mentioning)
Confidence: 99%
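The excerpt above describes extractive summarization as weighting and selecting sentences by ranking criteria. A minimal frequency-based sketch of that idea (the function name, scoring rule, and sample text are illustrative assumptions, not the cited systems) looks like:

```python
import re
from collections import Counter

def extractive_summary(text, n=2):
    """Rank sentences by average word frequency and keep the top n,
    preserving their original order (a toy extractive ranker)."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        # Length-normalized sum of corpus-wide word frequencies.
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    top = sorted(sentences, key=score, reverse=True)[:n]
    return " ".join(s for s in sentences if s in top)

doc = ("Arabic summarization has many approaches. "
       "Extractive methods rank sentences and select the best ones. "
       "Ranking can use word frequency. "
       "Frequency-based ranking is simple and fast.")
print(extractive_summary(doc, n=2))
```

Real systems replace the frequency score with statistical, semantic, graph-based, or learned criteria, as the excerpt enumerates, but the select-and-concatenate structure is the same.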
“…In this section, we first present the basic theoretical framework of the standard GAN in Part A; the characteristics of real seismic sequences are analyzed in Part B according to the evolution process of the seismic event. Finally, we design the EQGAN model based on appropriate algorithms in Part C.

| Ref  | Task                              | Model                  | Data                        | Approach                                                 |
| ---- | --------------------------------- | ---------------------- | --------------------------- | -------------------------------------------------------- |
|      | Earthquake/phase detection        | ConvNet                | Incomplete/low-SNR waveform | Using transformation method                              |
| [25] | Earthquake detection              | Unsupervised technique | Microseismic dataset        | Using transformation method                              |
| [26] | Earthquake detection              | SCALODEEP              | Small training dataset      | Using generalized deep learning based on a small dataset |
| [27] | Earthquake detection              | CPIC                   | Small-sized dataset         | Using generalized deep learning based on a small dataset |
| [28] | Earthquake detection              | CapsNet                | Small training dataset      | Using generalized deep learning based on a small dataset |
| [31] | Speech synthesis                  | HMM                    | Speech dataset              | Developing a data augmentation approach                  |
| [32] | Text generation                   | LSTM                   | Text dataset                | Developing a data augmentation approach                  |
| [33] | Text summarization                | Seq2seq                | Text dataset                | Developing a data augmentation approach                  |
| [29] | Seismic data augmentation         | Conditional GAN        | Seismic dataset             | Developing a data augmentation approach                  |
| [30] | Short seismic waveform generation | EarthquakeGen          | Seismic dataset             | Developing a data augmentation approach                  |

A. THEORETICAL BASIS: In 2014, Goodfellow [36] proposed the concept of GAN, an epoch-making unsupervised learning algorithm framework (Fig.…”
Section: Theory and Model Design (mentioning)
Confidence: 99%
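For orientation, the standard GAN framework of Goodfellow [36] that the excerpt refers to optimizes a two-player minimax objective (this is the well-known published formulation, not text from the excerpt):

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

The discriminator $D$ learns to separate real samples $x$ from generated samples $G(z)$, while the generator $G$ learns to make its outputs indistinguishable from the data distribution.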
“…However, compared with the traditional recurrent neural network (RNN) algorithm, training difficulty also increases because of the larger number of parameters. Zhao et al. [33] propose a seq2seq model based on LSTM together with an attention mechanism to improve the efficiency and quality of text summarization, which, however, lacks textual coherence. Although different DL models have been developed in these studies to achieve time-series generation, they cannot fully represent the distribution of the original data.…”
Section: Introduction (mentioning)
Confidence: 99%
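The attention mechanism mentioned in the excerpt lets the decoder weight encoder states by similarity to its current query. A rough dot-product sketch in plain Python (the function, toy vectors, and names are illustrative assumptions, not the cited model, which uses learned LSTM states):

```python
import math

def attention(query, keys, values):
    """Dot-product attention: softmax of query-key similarities
    produces weights that mix the value vectors into a context vector."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    # Numerically stable softmax over the similarity scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of value vectors, dimension by dimension.
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return weights, context

# Toy encoder states: the query aligns best with the second state,
# so that state dominates the attention weights.
keys = values = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
weights, context = attention([0.0, 2.0], keys, values)
print([round(w, 3) for w in weights], [round(c, 3) for c in context])
```

In a seq2seq summarizer, `keys`/`values` would be the encoder's hidden states and `query` the decoder state at each output step.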