Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems 2016
DOI: 10.1145/2851581.2892426
Assisting Food Journaling with Automatic Eating Detection

Cited by 33 publications (44 citation statements) | References 14 publications
“…In the future, technical solutions aimed at facilitating and automating food logging might overcome this limitation [60]. …”
Section: Discussion
confidence: 99%
“…Two studies sent participants real-time messages to confirm whether or not they were eating. In Ye et al 62 , participants were sent a short message on their smartwatch if an eating gesture was detected; participants could confirm or reject, in real time, that they were eating. Similarly, in Gomes and Sousa 48 , when drinking activity was detected, participants were sent an alert on their smartphone and could then confirm or reject, in real time, that they were drinking.…”
Section: Ground-truth Methods
confidence: 99%
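The real-time confirmation protocol described above can be sketched as follows. This is a minimal illustration, not code from any of the cited studies; the `ask_user` callback stands in for whatever smartwatch or smartphone notification API a study would actually use.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class GroundTruthLog:
    """Collects user-confirmed labels for detected eating/drinking events."""
    entries: list = field(default_factory=list)

    def prompt(self, detected_at: datetime, ask_user) -> bool:
        # ask_user is a hypothetical stand-in for a device notification API;
        # it returns True if the wearer confirms the detected activity.
        confirmed = ask_user("Are you eating right now?")
        self.entries.append({"time": detected_at, "eating": confirmed})
        return confirmed

log = GroundTruthLog()
# Simulate the detector firing twice: one confirmed, one rejected.
log.prompt(datetime(2024, 1, 1, 12, 30), ask_user=lambda q: True)
log.prompt(datetime(2024, 1, 1, 15, 0), ask_user=lambda q: False)
print(len(log.entries), sum(e["eating"] for e in log.entries))  # → 2 1
```

Each prompt yields a timestamped ground-truth label, which is exactly what makes this protocol attractive for validating automatic detectors in the field.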
“…On average, 2.10 wearable sensors (SD = 0.96) were used in the studies, with a range of 1-4 sensors ( Table 1). Approximately 63% (N = 25) of the 40 studies utilized an accelerometer (device that determines acceleration) either by itself (N = 4) or incorporated into a sensor system (N = 21) to detect eating activity 18,[37][38][39][40][41]43,45,46,48,49,51,53,[56][57][58][59]62 (Table 2). The second most frequently utilized wearable sensor was a gyroscope (device that determines orientation) (N = 15) 33,38,39,46,48,49,51,53,[56][57][58] , followed by a microphone (N = 8) 34,35,47,52,54,60,61 , a piezoelectric sensor (N = 7) 18,40-42,44,45 , an RF transmitter and receiver (N = 6) 18,40,41,44,45 , and a smartwatch camera (N = 5) 56,57 (Table 2).…”
Section: Wearable Sensors
confidence: 99%
“…Overview of food intake mechanism classification with corresponding technologies and studies: Acoustic [72]; Inertial [103], [113]; Acoustic [49]. These will be examined further in Section 5.…”
Section: Food Intake Mechanisms
confidence: 99%
“…A similar approach, developed by Mendi et al [112], can additionally send data from the accelerometer to the smartphone via Bluetooth. In a more recent advance, Ye et al [113,114] proposed an automatic system using two accelerometers simultaneously, one wrist-worn and one head-mounted, on devices such as the Pebble Watch and Google Glass, to detect chewing events and eating duration with high accuracy: 89.5% on a laboratory dataset of 12,003 epochs, of which 3325 were chewing-related. Another similar inertial system developed by Amft et al [115] employed four motion sensors at two positions on each arm (upper and lower) to detect drinking and eating gestures and, further, to detect chewing and swallowing sounds.…”
Section: Inertial Approach
confidence: 99%
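The epoch-based chewing detection described in the statement above can be sketched in a simplified form: split an accelerometer magnitude stream into fixed-length epochs, flag high-motion epochs as chewing, and sum flagged epochs into an eating duration. The variance threshold here is an illustrative stand-in for the trained classifiers the cited studies actually used; all parameter values are assumptions.

```python
import numpy as np

def classify_epochs(signal, fs=50, epoch_s=2.0, threshold=0.15):
    """Split an accelerometer magnitude signal into fixed-length epochs and
    flag each as chewing if its variance exceeds a threshold (a toy proxy
    for the learned chewing classifiers in the cited studies)."""
    n = int(fs * epoch_s)  # samples per epoch
    epochs = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    return [float(np.var(e)) > threshold for e in epochs]

def eating_duration(flags, epoch_s=2.0):
    """Total seconds covered by chewing-positive epochs."""
    return sum(flags) * epoch_s

rng = np.random.default_rng(0)
quiet = rng.normal(0, 0.05, 500)    # low-motion baseline (5 epochs)
chewing = rng.normal(0, 0.8, 300)   # high-energy jaw motion (3 epochs)
flags = classify_epochs(np.concatenate([quiet, chewing]))
print(eating_duration(flags))  # → 6.0
```

Real systems replace the threshold with a classifier trained on labeled epochs (as in the 12,003-epoch dataset mentioned above), but the epoch-then-aggregate structure is the same.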