2019
DOI: 10.1007/978-3-030-35288-2_25
Detecting Depression in Dyadic Conversations with Multimodal Narratives and Visualizations

Abstract: Conversations contain a wide spectrum of multimodal information that gives us hints about the emotions and moods of the speaker. In this paper, we developed a system that supports humans in analyzing conversations. Our main contribution is the identification of appropriate multimodal features and the integration of such features into verbatim conversation transcripts. We demonstrate the ability of our system to take in a wide range of multimodal information and automatically generate a prediction score for the …

Cited by 7 publications
(8 citation statements)
References 19 publications
“…The "Multimodal Narratives for Human" (MONAH) system [6,7], is an automated nonverbal annotation system. The system takes in one video per speaker and runs a pipeline of data processing steps.…”
Section: Automated Visualizations of Human-Human Conversations
mentioning
confidence: 99%
See 3 more Smart Citations
“…Compared to the Jefferson transcript, a MONAH transcript appears to be easier to read because it does not involve technical symbols. The authors found that the multimodal annotations improved the performance of supervised learning [6,7]. However, there was no mention of whether the annotations improved the users' understanding of the conversation.…”
Section: Automated Visualizations of Human-Human Conversations
mentioning
confidence: 99%
See 2 more Smart Citations
“…In 2014, Cambridge Analytica harvested data from around 200,000 Facebook accounts to build psychological profiles of 87 million users in an attempt to "target their inner demons" and sway their voting behaviors (Cadwalladr and Graham-Harrison 2018). Advances in machine learning and big data have now enabled the use of heretofore low-risk data, like what one posts publicly on Twitter, to be used to predict likelihood of medical conditions (Schneble et al 2020), contact with a pathogen (Keeling et al 2020), or mental illness (Kim et al 2019). Further, constant surveillance (empowered in part by mobile devices and social media) has enabled governments and corporations to develop social management infrastructures to control citizens', employees', and consumers' behaviors through social credit, targeted influence, or fear of discovery (Kostka 2019;West 2019).…”
mentioning
confidence: 99%