2021
DOI: 10.48550/arxiv.2103.15792
Preprint

Affect Analysis in-the-wild: Valence-Arousal, Expressions, Action Units and a Unified Framework

Dimitrios Kollias,
Stefanos Zafeiriou

Abstract: Affect recognition based on subjects' facial expressions has been a topic of major research in the attempt to generate machines that can understand the way subjects feel, act and react. In the past, due to the unavailability of large amounts of data captured in real-life situations, research has mainly focused on controlled environments. However, recently, social media and platforms have been widely used. Moreover, deep learning has emerged as a means to solve visual analysis and recognition problems. This pap…

Cited by 46 publications (57 citation statements)
References 29 publications
“…They also proposed multitask learning models that employ both visual and audio modalities and suggested using the ArcFace loss [6] for expression recognition. In an additional work, Kollias et al. [21], [18] studied the problem of non-overlapping annotations in multitask learning datasets. They explored task-relatedness and proposed a novel distribution matching approach, in which knowledge exchange is enabled between tasks via matching of their predictions' distributions.…”
Section: Related Work
confidence: 99%
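The citation statement above references the ArcFace loss for expression recognition. As a point of reference, the following is a minimal sketch of an ArcFace-style additive angular margin head in PyTorch; the scale and margin values (30.0, 0.5), the embedding size, and the seven-class setting are illustrative assumptions, not values taken from the cited papers.

```python
# Hedged sketch of an ArcFace-style additive angular margin loss for
# expression classification. Hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcFaceHead(nn.Module):
    def __init__(self, embedding_dim: int, num_classes: int,
                 scale: float = 30.0, margin: float = 0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_uniform_(self.weight)
        self.scale = scale
        self.margin = margin

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalised embeddings and class weights.
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        # Add the angular margin only to the target-class angle.
        target_mask = F.one_hot(labels, cosine.size(1)).bool()
        theta_m = torch.where(target_mask, theta + self.margin, theta)
        logits = self.scale * torch.cos(theta_m)
        return F.cross_entropy(logits, labels)

# Example usage (hypothetical feature extractor output):
# loss = ArcFaceHead(512, 7)(face_features, expression_labels)
```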
“…They demonstrate that multimodal deep learning affect models can significantly improve affect detection in the wild. Finally, Kollias and Zafeiriou [34] propose a unified framework for affect modeling in the wild that considers facial expressions and categorical affect, facial action units, and dimensional affect representations.…”
Section: B. Affect Modeling in the Wild
confidence: 99%
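To make the kind of unified framework described above concrete, here is a minimal sketch of a single shared backbone with three output heads (valence-arousal regression, categorical expression classification, and action-unit detection). The backbone choice (ResNet-18), head sizes, and the number of action units are assumptions for illustration, not the architecture of the cited paper.

```python
# Hedged sketch of a multi-head affect model: one shared face-feature
# backbone feeding valence-arousal, expression, and action-unit heads.
import torch
import torch.nn as nn
import torchvision.models as models

class UnifiedAffectModel(nn.Module):
    def __init__(self, num_expressions: int = 7, num_aus: int = 12):
        super().__init__()
        backbone = models.resnet18(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()  # shared face-feature extractor
        self.backbone = backbone
        self.va_head = nn.Linear(feat_dim, 2)                  # valence, arousal in [-1, 1]
        self.expr_head = nn.Linear(feat_dim, num_expressions)  # expression logits
        self.au_head = nn.Linear(feat_dim, num_aus)            # per-AU activation logits

    def forward(self, images: torch.Tensor):
        features = self.backbone(images)
        return {
            "valence_arousal": torch.tanh(self.va_head(features)),
            "expression_logits": self.expr_head(features),
            "au_logits": self.au_head(features),  # apply sigmoid for AU probabilities
        }
```

Each head can be trained with its own loss (e.g., CCC or MSE for valence-arousal, cross-entropy for expressions, binary cross-entropy for action units) and the losses summed, which is one common way to realise joint training over the three tasks.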
“…However, there are many obstacles for HCI systems used in real-world applications, such as in-the-wild or real-time tasks. To address these problems, Kollias et al. have been hosting the Affective Behavior Analysis in-the-wild (ABAW) Competition, which involves a variety of research activities, for two years [5,6,7,8,9,11,12,28]. Most of the top-ranked teams in the first challenge of ABAW (ABAW1) [6], held in conjunction with the 15th IEEE Conference on Face and Gesture Recognition (FG2020), used convolutional neural networks (CNNs) with single facial images or sequences of such images.…”
Section: Introduction
confidence: 99%