TETFN: A text enhanced transformer fusion network for multimodal sentiment analysis (2023)
DOI: 10.1016/j.patcog.2022.109259

Cited by 80 publications (7 citation statements); References 11 publications.
“…In text-image person re-identification, Yan et al 29 launched a fine-grained information excavation framework which is driven by contrastive language-image pretraining. As for multimodal sentiment analysis, Wang et al 30 proposed the text enhanced transformer fusion network which can achieve effective unified multimodal representations.…”
Section: Multi-modal Joint Decision
Confidence: 99%
“…More recently, Wang et al [52] proposed a novel Text Enhanced Transformer Fusion Network (TETFN) method that learns text-oriented pairwise cross-modal mappings to obtain effective unified multi-modal representations. Yang et al [53] applied BERT to translate visual and audio features into text features to enhance the quality of both visual and audio features.…”
Section: Multi-modal Sentiment Analysis
Confidence: 99%
“…TETFN [26] provided a text-enhanced transformer fusion network that captured the relevance between text and image through bidirectional attention mechanisms and enhanced the semantic representations of images using textual features.…”
Section: Evaluation Metrics
Confidence: 99%