2018 International Workshop on Advanced Image Technology (IWAIT)
DOI: 10.1109/iwait.2018.8369761
Sleep posture classification with multi-stream CNN using vertical distance map

Cited by 19 publications
(8 citation statements)
References 13 publications
“…Some studies use depth cameras [16][17][18][19]. Grimm et al [16] use depth maps to detect sleep posture.…”
Section: Sleep Posture Classification
Mentioning confidence: 99%
“…A multi-stream CNN was also applied in this field, which was based on depth images for identifying ten sleep postures with high accuracy [14]. A sleep monitoring system by embedding radio frequency identification (RFID) tags was proposed for sleeping posture recognition and body movement detection, which used a convolutional neural network (CNN) to identify the sleeping postures.…”
Section: A Sleep Posture Recognition
Mentioning confidence: 99%
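The statement above describes a multi-stream CNN that processes depth-based input maps to classify ten sleep postures. As a rough illustration of the fusion idea only (not the paper's actual architecture), the sketch below extracts features from two hypothetical input maps — a depth map and a vertical distance map — in separate streams, concatenates them, and applies a softmax classifier over ten posture classes; all names, the pooling "stream", and the random weights are illustrative assumptions.

```python
import numpy as np

def stream_features(img, n_pool=4):
    # Toy stand-in for one CNN stream: average-pool the map
    # into an n_pool x n_pool grid and flatten it.
    h, w = img.shape
    bh, bw = h // n_pool, w // n_pool
    blocks = img[:bh * n_pool, :bw * n_pool].reshape(n_pool, bh, n_pool, bw)
    return blocks.mean(axis=(1, 3)).ravel()

def multi_stream_classify(streams, W, b):
    # Fuse per-stream features by concatenation, then apply
    # a linear layer followed by softmax over posture classes.
    x = np.concatenate([stream_features(s) for s in streams])
    logits = x @ W + b
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
depth = rng.random((64, 64))   # hypothetical depth map
vdist = rng.random((64, 64))   # hypothetical vertical distance map
n_feat = 2 * 4 * 4             # two streams, each a 4x4 pooled grid
W = rng.standard_normal((n_feat, 10)) * 0.1  # 10 posture classes
b = np.zeros(10)
probs = multi_stream_classify([depth, vdist], W, b)
```

In a real system, each stream would be a learned convolutional branch rather than fixed pooling, but the fusion-before-classification structure is the same.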
“…However, this kind of sleeping posture is easy to lead to the fall of the tongue root and block breathing, which is not suitable for people who often snore or have respiratory diseases. Lying on the right side is conducive to the normal operation of the gastrointestinal tract and will not compress the heart, but it can affect the movement of the right lung [2]. On the other hand, monitoring the sleep stage is also an important way to evaluate sleep state.…”
Section: Introductionmentioning
confidence: 99%
“…It's possible to find many other recent works in the literature that use CNNs for image classification in different tasks (e.g., [25], [26] and [27]).…”
Section: Introduction
Mentioning confidence: 99%