Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3170427.3188664
Detecting Negative Emotion for Mixed Initiative Visual Analytics

Cited by 6 publications (5 citation statements)
References 9 publications
“…Most previous work [14,23,31] solves this problem by manually designing handcrafted features across different sensors and fusing them at decision level [29], which is time-consuming and often leads to low accuracies. Other works [1,28,33] segment or pad the signals to let them have fixed lengths and train the data with neural networks.…”
Section: Introduction (mentioning)
confidence: 99%
“…Following guidelines from previous studies [80,81,82] and following the preferences of extensive user piloting, we sent via text message a survey request to each participant two times per workday. Pilot participants preferred the usage of text message, in part, due to them being accessible and noticeable anywhere in the office.…”
Section: Surveys (mentioning)
confidence: 99%
“…For engagement: Mixed-initiative interaction systems may leverage bio-sensing and other tools of affective computing to provide personalized and just-in-time guidance to mitigate or prevent frustration and increase engagement in an analysis process (Conati et al, 2013; Panwar and Collins, 2018; McDuff et al, 2012).…”
Section: Goals and Aspects of Guidance (mentioning)
confidence: 99%
“…For example, Shao et al (2017) used eye-tracking records of which areas of a scatter plot matrix have been explored so far, to inform guidance. Panwar and Collins (2018) use GSR sensing and eye tracking to detect user frustration to provide just-in-time guidance.…”
Section: Obtaining User Input for Guidance (mentioning)
confidence: 99%