2020
DOI: 10.48550/arxiv.2004.14535
Preprint
Text Segmentation by Cross Segment Attention

Cited by 4 publications (4 citation statements). References 0 publications.
“…These dynamic word representations, extracted from pre-trained models such as those presented in [3], [21]-[23], greatly outperformed their static predecessors in various NLP tasks [24]. In the field of topic segmentation, the introduction of dynamic word representations has also produced results superior to those of previous approaches [25]-[27].…”
Section: Topic Segmentation
confidence: 99%
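The similarity-based use of such representations can be sketched in a few lines. This toy uses bag-of-words count vectors as a stand-in for a pretrained encoder (a real system would embed each sentence with a BERT-style model); the names `embed` and `topic_boundaries` are illustrative only:

```python
import math
from collections import Counter

def embed(sentence):
    """Toy stand-in for a contextual encoder: a bag-of-words count
    vector. A real system would call a pretrained model here."""
    return Counter(sentence.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def topic_boundaries(sentences, threshold=0.1):
    """Place a boundary after sentence i when the similarity between
    consecutive sentence representations falls below the threshold."""
    vecs = [embed(s) for s in sentences]
    return [i for i in range(len(vecs) - 1)
            if cosine(vecs[i], vecs[i + 1]) < threshold]

docs = [
    "the cat sat on the mat",
    "the cat chased the mouse",
    "stock prices fell sharply today",
    "prices fell as the stock market closed lower today",
]
print(topic_boundaries(docs))  # → [1]
```

With a contextual encoder in place of `embed`, the same drop-in-similarity rule benefits from the dynamic, context-sensitive vectors the quoted passage describes.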
“…Finally, a segment boundary is defined as an SRT block that exhibits the maximum cosine similarity and surpasses a predefined similarity threshold. Furthermore, [12] introduces a cross-segment attention mechanism designed to identify significant boundaries within text through the capture of inter-segment relationships. This method, which considers context and connections between segments, shows promising outcomes in enhancing the accuracy and effectiveness of text segmentation.…”
Section: Video Segmentation Based On Text Algorithms
confidence: 99%
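The cross-segment idea can be framed as a binary decision at each candidate break given its left and right context. A minimal sketch, assuming a hypothetical lexical-overlap scorer in place of the trained cross-segment attention model described in [12]:

```python
def candidate_breaks(sentences, window=2):
    """Yield (index, left_context, right_context) for every position
    between two sentences. The real model feeds both contexts to a
    transformer and lets attention compare them across the break."""
    for i in range(1, len(sentences)):
        left = sentences[max(0, i - window):i]
        right = sentences[i:i + window]
        yield i, left, right

def is_boundary(left, right):
    """Stand-in classifier: declare a boundary when the two contexts
    share no content words. A trained model replaces this heuristic."""
    stop = {"the", "a", "on", "and", "as"}
    lw = {w for s in left for w in s.lower().split()} - stop
    rw = {w for s in right for w in s.lower().split()} - stop
    return not (lw & rw)

def segment(sentences, window=2):
    """Return the indices of sentences that start a new segment."""
    return [i for i, l, r in candidate_breaks(sentences, window)
            if is_boundary(l, r)]
```

The key design point the quoted passage highlights is that the decision uses context on both sides of the candidate break, rather than a single local similarity score.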
“…SegBot [20] and [21]: sentence and document; DisSim [22]: discourse sentence (English); three BERT-style models [23]: discourse sentence and document…”
Section: Multilingual
confidence: 99%
“…Context-preserving approach [4]: simple sentence; TopicDiff-LDA: Latent Dirichlet Allocation. The natural language understanding (NLU) approaches depend on artificial neural networks, particularly those adopting transformer-based models (e.g., [23]).…”
Section: Monolingual
confidence: 99%