2016
DOI: 10.1109/mprv.2016.36

Opportunistic and Context-Aware Affect Sensing on Smartphones

Abstract: Opportunistic affect sensing offers unprecedented potential for capturing spontaneous affect, eliminating the biases inherent in controlled settings. Facial expression and voice are two major affective displays; however, most affect sensing systems on smartphones avoid them due to their extensive power requirements. Encouragingly, with the recent advent of low-power DSP (digital signal processing) co-processor and GPU (graphics processing unit) technology, audio and video sensing are becoming more feasible on smartph…


Cited by 18 publications (11 citation statements) | References 39 publications
“…This discussion only provides a limited list of the ethical challenges and does not offer specific solutions to these complex problems. Many significant issues were omitted such as how patient monitoring systems handle data that are inadvertently captured about other people such as facial images, voice recordings, and metadata (Rana et al 2016 ), and new legal issues such as timeliness of response to monitoring data (Armontrout et al 2016 ). Other issues that were omitted include whether health-related chatbots (automated conversational software) should deceive patients into thinking they are interacting with a human (Whitby 2014 ), the coming of medications with sensors for adherence monitoring (Kane et al 2013 ), the monitoring of people with dementia (Niemeijer et al 2011 ), and the evaluation of long-term clinical value.…”
Section: Limitations (mentioning, confidence: 99%)
“…Smartphones, which embed all these sensors, can be considered a type of wearable due to their pervasiveness. Furthermore, the low adoption barrier for healthcare applications [28] distributed through application markets such as Google Play or the AppStore makes them the best option for targeting the mass market. Some of these applications focus on fall detection [29,30] but normally do not cover both ADLs and falls [31], so a classification system must be designed to consider both.…”
Section: Activity Recognition Systems for Elders (mentioning, confidence: 99%)
“…Despite being a complex and very CPU-intensive process, the analysis of voice and facial expression can also be used as a monitoring mechanism [20]. It can provide an accurate approximation of current well-being levels.…”
Section: A. Monitoring Mechanisms (mentioning, confidence: 99%)
“…According to Rana, Margee, Reilly, Jurdak and Soar (2016), another factor that is almost never monitored in this kind of study is opportunistic facial expression, a factor that allows inferring the current mood state with high accuracy [20].…”
Section: A. Monitoring Mechanisms (mentioning, confidence: 99%)