2018
DOI: 10.1109/access.2018.2850284
Multimodal Egocentric Analysis of Focused Interactions

Cited by 17 publications (14 citation statements)
References 33 publications
“…In (Bolaños and Radeva, 2016) a two-step method is presented for food detection and recognition in lifelogging images. Recognition of personal locations in the wearer's daily activities is studied in (Furnari et al, 2017), while social interactions and lifestyle patterns are analyzed in (Bano et al, 2018) and (Herruzo et al, 2017), respectively. Visual lifelogging has also been used in ambient assisted living applications (Climent-Pérez et al, 2020) such as fall detection and monitoring.…”
Section: Background 2.1 Visual Lifelogging
confidence: 99%
“…The dataset was used for training systems to recognize hand actions. The abundance and availability of wearable cameras and smartphones have led to the creation of several first-person datasets in recent years (Bolaños et al, 2015), including datasets for object recognition (Bullock et al, 2015), activity recognition (Gurrin et al, 2016), and social interaction analysis (Bano et al, 2018). However, not all datasets are publicly available to the academic community.…”
Section: Egocentric Databases
confidence: 99%
“…Recently, wearable cameras have enabled the automatic capture of social life in a naturalistic setting, from a first-person point of view [1]. This has opened the unique opportunity of analyzing real involvement in social interactions at the personal level [2][3][4]. However, the research focus in the egocentric vision domain has so far been on the detection [3,4] and classification of social interactions based on the kind of relations [5,6], while the problem of fine-grained classification of a specific relation based on the degree of interactivity has been addressed only in [2]. This latter categorization would be of paramount importance to truly understand social interactions and thus to allow easier human-machine communication.…”
Section: Introduction
confidence: 99%
“…The objective of the proposed research work is to exploit these properties for summarization of egocentric videos. Egocentric video summarization has useful applications in many domains, e.g., law enforcement [1,3], health care [4], surveillance [5], sports [6], and media [7,8].…”
Section: Introduction
confidence: 99%