Background: Eating behavior has a high impact on the well-being of an individual. Such behavior involves not only when an individual is eating, but also various contextual factors such as with whom and where an individual is eating and what kind of food the individual is eating. Despite the relevance of such factors, most automated eating detection systems are not designed to capture contextual factors.

Objective: The aims of this study were to (1) design and build a smartwatch-based eating detection system that can detect meal episodes based on dominant hand movements, (2) design ecological momentary assessment (EMA) questions to capture meal contexts upon detection of a meal by the eating detection system, and (3) validate the meal detection system that triggers EMA questions upon passive detection of meal episodes.

Methods: The meal detection system was deployed among 28 college students at a US institution over a period of 3 weeks. The participants reported various contextual data through EMAs triggered when the eating detection system correctly detected a meal episode. The EMA questions were designed after conducting a survey study with 162 students from the same campus. Responses from EMAs were used to define exclusion criteria.

Results: Among the total consumed meals, 89.8% (264/294) of breakfast, 99.0% (406/410) of lunch, and 98.0% (589/601) of dinner episodes were detected by our novel meal detection system. Overall, the eating detection system captured 96.48% (1259/1305) of the meals consumed by the participants. The meal detection classifier achieved a precision of 80%, a recall of 96%, and an F1-score of 87.3%. We found that over 99% (1248/1259) of the detected meals were consumed with distractions. Such eating behavior is considered "unhealthy" and can lead to overeating and uncontrolled weight gain. A high proportion of meals was consumed alone (680/1259, 54.01%). Our participants self-reported 62.98% (793/1259) of their meals as healthy. Together, these results have implications for designing technologies to encourage healthy eating behavior.

Conclusions: The presented eating detection system is the first of its kind to leverage EMAs to capture the eating context, which has strong implications for well-being research. We reflect on the contextual data gathered by our system and discuss how these insights can be used to design individual-specific interventions.
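The abstract reports classifier metrics (precision 80%, recall 96%, F1 87.3%) but no implementation details, so the following is a minimal sketch of the general approach it describes: sliding-window statistics over dominant-hand wrist accelerometer data feeding a binary eating-gesture classifier, scored with the same metrics. The sampling rate, window length, features, classifier, and synthetic data below are all illustrative assumptions, not the authors' design.

```python
# Minimal sketch of wrist-based eating detection, under assumptions (see lead-in).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

FS, WIN_S = 50, 4                      # assumed sampling rate (Hz) and window length (s)
rng = np.random.default_rng(0)

def make_window(eating: bool) -> np.ndarray:
    """Synthesize one (FS*WIN_S, 3) accelerometer window, for illustration only."""
    t = np.arange(FS * WIN_S) / FS
    acc = rng.normal(0.0, 0.3, (len(t), 3))            # baseline wrist noise
    if eating:                                         # slow hand-to-mouth motion ~0.5 Hz
        acc[:, 2] += np.sin(2 * np.pi * 0.5 * t)
    return acc

def features(win: np.ndarray) -> np.ndarray:
    """Per-axis mean, std, and mean absolute first difference of one window."""
    return np.concatenate([win.mean(0), win.std(0),
                           np.abs(np.diff(win, axis=0)).mean(0)])

labels = rng.integers(0, 2, 600)                       # 1 = eating window, 0 = other
X = np.array([features(make_window(bool(y))) for y in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"precision={precision_score(y_te, pred):.2f}  "
      f"recall={recall_score(y_te, pred):.2f}  f1={f1_score(y_te, pred):.2f}")
```

In a deployment like the one described, windows flagged as eating would be aggregated into meal episodes, and each detected episode would trigger the EMA questionnaire on the watch or phone.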
Deviant eating behavior, such as skipping meals and consuming unhealthy meals, is significantly associated with mental well-being in college students. However, eating behavior involves more than what an individual eats. While eating patterns form a critical component of students' mental well-being, insights and assessments related to the interplay of eating patterns and mental well-being remain under-explored in theory and practice. To bridge this gap, we use an existing real-time eating detection system that captures context during meals to examine how college students' eating context is associated with their mental well-being, particularly their affect, anxiety, depression, and stress. Our findings suggest that irregular eating and meal skipping correlate negatively with students' mental well-being, whereas eating with family and friends correlates positively with it. We discuss the implications of our study for designing dietary intervention technologies and guiding student-centric well-being technologies.
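Since the findings are correlational, a short sketch of this style of analysis may clarify what is being computed; the variable names, effect sizes, and synthetic data below are assumptions for illustration, not the study's instruments or results.

```python
# Illustrative correlation analysis between eating-context features and a
# well-being score; all variables and data here are synthetic assumptions.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 120                                # hypothetical number of students
df = pd.DataFrame({
    "meals_skipped_per_week": rng.poisson(3, n),
    "meals_with_companions": rng.poisson(7, n),
})
# Synthetic score built to mimic the reported directions of association.
df["wellbeing_score"] = (10.0
                         - 0.8 * df["meals_skipped_per_week"]
                         + 0.5 * df["meals_with_companions"]
                         + rng.normal(0, 2, n))

for col in ("meals_skipped_per_week", "meals_with_companions"):
    r, p = pearsonr(df[col], df["wellbeing_score"])
    print(f"{col}: r={r:+.2f}, p={p:.3g}")
```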
Routine blood pressure (BP) measurement in pregnancy is commonly performed using automated oscillometric devices. Since no wireless oscillometric BP device has been validated in preeclamptic populations, a simple approach for capturing readings from such devices is needed, especially in low-resource settings where transmission of BP data from the field to central locations is an important mechanism for triage. To this end, a total of 8192 BP readings were captured from the liquid crystal display (LCD) screen of a standard Omron M7 self-inflating BP cuff using a cellphone camera. A cohort of 49 lay midwives captured these data from 1697 pregnant women carrying singletons, between 6 and 40 weeks gestational age, during routine screening in rural Guatemala. The images exhibited wide variability in appearance due to variations in orientation and parallax, environmental factors such as lighting and shadows, and acquisition factors such as motion blur and poor focus. Images were independently labeled for readability and quality by three annotators (labeled BP values ranged from 34 to 203 mm Hg), and disagreements were resolved. Methods were developed to preprocess the images and automatically segment the LCD panel into diastolic BP, systolic BP, and heart rate regions using a contour-based technique. A deep convolutional neural network was then trained to convert the segmented images into numerical values using a multi-digit recognition approach. On readable low- and high-quality images, the proposed approach achieved 91% classification accuracy with a mean absolute error of 3.19 mm Hg for systolic BP, and 91% accuracy with a mean absolute error of 0.94 mm Hg for diastolic BP. These errors are within the FDA guidelines for BP monitoring when poor-quality images are excluded. The proposed approach greatly outperformed state-of-the-art open-source tools (Tesseract and the Google Vision API). The algorithm was designed to run on a phone and operate without network connectivity.
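The pipeline has two stages: contour-based segmentation of the LCD digits, then CNN-based multi-digit recognition. Below is a hedged sketch of that shape using OpenCV and PyTorch; the threshold parameters, contour filters, crop size, and network architecture are illustrative assumptions rather than the published design, and the model would need to be trained on labeled digit crops before use.

```python
# Sketch of the two-stage LCD-reading pipeline, under assumptions (see lead-in).
import cv2
import numpy as np
import torch
import torch.nn as nn

def segment_digits(lcd_bgr: np.ndarray) -> list:
    """Contour-based segmentation: binarize the panel, keep digit-sized contours,
    and return 28x28 crops ordered left to right."""
    gray = cv2.cvtColor(lcd_bgr, cv2.COLOR_BGR2GRAY)
    # Adaptive thresholding helps with the uneven lighting and shadows reported
    # in the field images; the block size and offset are assumed values.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours]
    # Keep roughly digit-shaped boxes; aspect/size limits are assumed, per device.
    boxes = [b for b in boxes if b[3] > 20 and 0.2 < b[2] / b[3] < 1.2]
    boxes.sort(key=lambda b: b[0])                 # left-to-right reading order
    return [cv2.resize(binary[y:y + h, x:x + w], (28, 28)) for x, y, w, h in boxes]

class DigitCNN(nn.Module):
    """Small convolutional classifier for single 28x28 digit crops (10 classes)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(32 * 7 * 7, 10))

    def forward(self, x):
        return self.net(x)

def read_display(lcd_bgr: np.ndarray, model: DigitCNN) -> int:
    """Classify each crop and concatenate the digits into one numeric reading."""
    crops = segment_digits(lcd_bgr)
    if not crops:
        return -1                                  # unreadable image
    batch = torch.stack([torch.from_numpy(c).float().div(255).unsqueeze(0)
                         for c in crops])
    digits = model(batch).argmax(1).tolist()
    return int("".join(map(str, digits)))
```

In practice, the systolic, diastolic, and heart rate fields would each be localized first (the paper segments the display into those three regions) and a per-field reading assembled from the recognized digits; a compact model of this size is also consistent with the stated goal of running on a phone without network connectivity.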