2021
DOI: 10.3390/jimaging7120255

Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation

Abstract: Powered wheelchairs have enhanced the mobility and quality of life of people with special needs. The next step in the development of powered wheelchairs is to incorporate sensors and electronic systems for new control applications and capabilities to improve their usability and the safety of their operation, such as obstacle avoidance or autonomous driving. However, autonomous powered wheelchairs require safe navigation in different environments and scenarios, making their development complex. In our research,…

Cited by 7 publications (1 citation statement)
References 48 publications (57 reference statements)
“…Despite this, the use of a single monocular camera does not come without problems. In fact, other setups presented in the literature [21], which can be used to cut down on the cost of the hardware needed for automatic control of a PW, consisted of a stereo camera and an RGB camera to detect and track the feet of the caregiver based on the Tiny YOLO (You Only Look Once) neural network. In this case, although the use of Convolutional Neural Networks brings great robustness and accuracy, in accordance with what was demonstrated in [22], the setup employed had a critical issue: it required an additional depth camera, whether based on LiDAR or stereo camera technology, because distance estimation with the monocular camera alone exhibited very limited accuracy.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
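As a rough illustration of the setup this citing work describes, the sketch below (not code from the cited paper) runs a YOLO-family detector on an RGB frame and reads a pixel-aligned depth frame inside each detected foot box to estimate distance, which is the role the additional LiDAR/stereo depth camera plays in [21]. The ultralytics model file, the assumption that all detections are feet, the pre-aligned depth frame in millimetres, and the synthetic test frames are all illustrative choices; the cited setup used a Tiny YOLO network trained specifically on the caregiver's feet.

# Illustrative sketch only -- not the implementation from [21]. Assumes an
# ultralytics YOLO model as a stand-in for the feet-trained Tiny YOLO detector
# and a depth frame (millimetres) registered pixel-for-pixel to the RGB frame.
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # hypothetical generic weights, not a feet-specific model

def estimate_foot_distances(rgb_frame, depth_frame_mm):
    """Return one distance estimate in metres per detected bounding box."""
    result = model(rgb_frame, verbose=False)[0]
    distances = []
    for x1, y1, x2, y2 in result.boxes.xyxy.cpu().numpy().astype(int):
        roi = depth_frame_mm[y1:y2, x1:x2]
        valid = roi[roi > 0]          # ignore zero-valued holes in the depth map
        if valid.size:
            distances.append(float(np.median(valid)) / 1000.0)
    return distances

if __name__ == "__main__":
    # Synthetic frames so the sketch runs without any camera hardware attached.
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)
    depth = np.full((480, 640), 1500, dtype=np.uint16)   # depth map reading 1.5 m everywhere
    print(estimate_foot_distances(rgb, depth))

Taking the median depth inside each box, rather than the mean, keeps the estimate robust to background pixels and depth holes at the box edges; this is a common heuristic and is only one possible way to fuse the detection with the depth data.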