2022
DOI: 10.1007/s00521-022-07139-y

Learning multi-level representations for affective image recognition

Abstract: Images can convey intense affective experiences and influence people on an emotional level. With the prevalence of online pictures and videos, evaluating emotions from visual content has attracted considerable attention. Affective image recognition aims to automatically classify the emotions conveyed by digital images. Existing studies using hand-crafted features or deep networks focus mainly on low-level visual features or high-level semantic representations without considering all factors. To better understand how…
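The abstract's core idea, fusing low-level visual features with high-level semantic representations into one vector, can be illustrated with a minimal sketch. This is not the paper's actual architecture: the color histogram stands in for low-level features, and `semantic_features` is a deterministic stub where a deep CNN embedding would normally go.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Low-level feature: normalized per-channel color histogram."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(image.shape[-1])]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def semantic_features(image, dim=16):
    """High-level feature stub; in practice this would be a CNN embedding."""
    rng = np.random.default_rng(0)  # fixed seed: deterministic placeholder projection
    proj = rng.standard_normal((image.size, dim))
    return image.reshape(-1) @ proj / image.size

def multi_level_representation(image):
    """Concatenate low-level and high-level features into a single vector
    that a downstream classifier could consume."""
    return np.concatenate([color_histogram(image), semantic_features(image)])

image = np.zeros((4, 4, 3), dtype=np.uint8)  # tiny dummy RGB image
rep = multi_level_representation(image)
print(rep.shape)  # 8 bins x 3 channels + 16 semantic dims -> (40,)
```

In a real multi-level model the two branches would typically be taken from shallow and deep layers of the same network and fused before the classification head; the concatenation above only sketches that fusion step.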

Cited by 13 publications (9 citation statements) · References 49 publications
“…Unlike their method of fine-tuning existing deep architectures, You et al [11] design a progressively learned CNN to classify image sentiment. Some researchers also combine different hierarchical features in deep models for sentiment classification, from global to local perspectives [12,13] . It is also worth mentioning that in the recent work of You et al [14] , AR [5] , and R-CNNGSR [6] , information from different image regions is used for image sentiment analysis.…”
Section: Deep Representations for Emotion Recognition (mentioning)
confidence: 99%
“…Accuracy (%) comparison:

| Method | ArtPhoto | Abstract | Twitter I (5-agree) | Twitter I (4-agree) | Twitter I (3-agree) | EmotionROI |
|---|---|---|---|---|---|---|
| SentiBank [27] | 67.74 | 64.95 | 71.32 | 68.28 | 66.63 | 66.18 |
| PAEF [28] | 67.85 | 70.05 | 72.90 | 69.61 | 67.92 | 75.24 |
| DeepSentiBank [30] | 68.73 | 71.19 | 76.35 | 70.15 | 71.25 | 70.11 |
| PCNN [11] | 70.96 | 70.84 | 82.54 | 76.52 | 76.36 | 73.58 |
| VGG-16 w/o fine-tuning [32] | 67.61 | 68.86 | 83.44 | 78.67 | 75.49 | 72.25 |
| VGG-16 [32] | 70.09 | 72.48 | 84.35 | 82.26 | 76.75 | 77.02 |
| SentiNet-A [29] | – | – | 85.10 | 80.70 | 77.7 | – |
| AR [5] | 74.80 | 76.03 | 88.65 | 85.10 | 81.06 | 81.26 |
| R-CNNGSR [6] | 75.02 | 75.89 | – | – | – | 81.36 |
| MLM w/o L_bis [13] | 74.32 | 76.82 | 87.19 | 82.95 | 80.42 | 81.53 |
| MLM [13] | 75…”
Section: Model (mentioning)
confidence: 99%
“…From 2005 to 2012, the same search yielded 591 results; from 2013 to 2021, the results of the same search had jumped to 3529. In that time, emotion classification has been the subject of analysis in diverse domains including image recognition [1], animation [2], root cause diagnosis [3], online reviews [4], and social network analysis [5][6][7]. Twitter, in particular, has become a lightning rod for researchers aiming to model human language through various machine learning techniques.…”
Section: Introduction (mentioning)
confidence: 99%