CHI Conference on Human Factors in Computing Systems 2022
DOI: 10.1145/3491102.3502105
ControllerPose: Inside-Out Body Capture with VR Controller Cameras

Cited by 26 publications (6 citation statements)
References 45 publications
“…In contrast to external body-pose capture systems, which include high-end commercial marker-based camera systems [42], depth and RGB cameras [10,16,39], and non-optical systems [26,67,68], body-worn systems enable tracking in mobile settings and do not require the user to remain within a constrained area. To obtain a wider view of the user's body for tracking, cameras with wide-angle lenses have been attached to various parts of the body, including the wrist [62], the feet [7], and the chest [25,27], or, alternatively, integrated into controllers [3] or suspended from the headset [1,46], further away from the user's body. However, these solutions tend to suffer from occlusion and are often obtrusive to wear.…”
Section: Body Capture Using Worn Sensors
confidence: 99%
“…In the context of metaverse VR applications, apart from many studies on enabling technologies [18,27,38,48] such as AR/VR [1,24,26,29,30,34,42], motion sensing [3], and platform governance [9,10,19], the architectural problems and scalability issues to be solved by metaverse developers (e.g., bandwidth consumption with an increasing number of users, CPU/GPU utilization of terminal devices, and the rendering cost of user avatars) have been studied in [13,14]. The authors of [14] have also pointed out one critical scalability issue that could be solved by changing the operational architecture of metaverses: the consumption of user bandwidth and CPU/GPU resources currently depends on the number of active users and can thus introduce high overhead on the VR headset. One potential solution is remote rendering, i.e., rendering application graphics on a cloud server instead of on user VR devices.…”
Section: Related Work
confidence: 99%
“…Various user input data from consumer-grade devices can be used to synthesize full-body animation for characters in real time. For example, some studies used optical data from egocentric cameras mounted in a baseball cap [XCZ*19], an HMD [TAP*20; YCQ*22], controllers [ASF*22], or glasses [ZWMF21] to estimate the body pose. Egocentric cameras suffer from extreme perspective distortion and self-occlusion, which lead to inadequate tracking information for the lower body.…”
Section: Related Work
confidence: 99%