2022 7th International Conference on Communication, Image and Signal Processing (CCISP)
DOI: 10.1109/ccisp55629.2022.9974277
Environment Perception Technology for Intelligent Robots in Complex Environments: A Review

Cited by 6 publications (3 citation statements)
References 28 publications
“…Machine-learning approaches, specifically probabilistic methods like Bayesian filtering, play a pivotal role in combining information from various sensors (e.g., cameras, LiDAR, IMUs) to generate a coherent representation of the environment [148][149][150]. Moreover, Kalman Filters and their variants (Extended Kalman Filter, Unscented Kalman Filter) are extensively used for sensor fusion, enabling estimation of the robot's state and the environment's structure [151][152][153]. Additionally, Particle Filters provide a non-parametric approach to handle non-linear and non-Gaussian distributions [154,155].…”
Section: Sensor Fusion for Multi-Modal Perception
Mentioning confidence: 99%
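The excerpt above highlights Bayesian filtering, Kalman filters, and particle filters as core tools for multi-sensor fusion. As a minimal sketch of the idea, assuming a constant-velocity motion model and simulated noisy position fixes (all matrices, noise levels, and data below are illustrative assumptions, not drawn from the reviewed paper or the citing works), a linear Kalman filter predict-update cycle looks like this in Python:

# Minimal sketch of a linear Kalman filter fusing noisy position fixes
# (e.g., LiDAR- or odometry-derived positions) with a constant-velocity model.
# All matrices, noise levels, and the simulated data are illustrative assumptions.
import numpy as np

dt = 0.1  # time step [s]

# State: [x, y, vx, vy]; constant-velocity transition model
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
# Measurement: position only, as a position sensor might provide
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)

Q = 0.01 * np.eye(4)   # process noise covariance (assumed)
R = 0.25 * np.eye(2)   # measurement noise covariance (assumed)

x = np.zeros(4)        # initial state estimate
P = np.eye(4)          # initial state covariance

def kf_step(x, P, z):
    """One predict-update cycle for a position measurement z = [x_meas, y_meas]."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Simulated noisy position fixes for a target moving at 1 m/s along x
rng = np.random.default_rng(0)
for k in range(50):
    true_pos = np.array([k * dt * 1.0, 0.0])
    z = true_pos + rng.normal(scale=0.5, size=2)
    x, P = kf_step(x, P, z)

print("Estimated state [x, y, vx, vy]:", np.round(x, 3))

Extended and unscented Kalman filters keep the same predict-update structure but propagate the state through a linearized or sigma-point approximation of a nonlinear model, while particle filters replace the Gaussian posterior with a weighted set of samples.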
“…In robotics, a 4D mmw radar can be integrated as a perception sensor to enable robots to detect and track objects, relocate, and avoid obstacles in their environment [55][56][57][58], which is similar to the applications in autonomous driving. In addition, some researchers have also applied this technology to develop human-following robots [59] and estimate the velocity of robots in visually degraded environments [60].…”
Section: Application in Robotics
Mentioning confidence: 99%
“…Another review by Wu et al (2022) discusses the limitations of vision-based environmental perception technologies. It emphasizes the importance of multi-sensor fusion for enhanced adaptability and performance in complex, unstructured conditions, offering insights into various application scenarios, datasets, and sensor fusion methods.…”
Section: Significance of the Survey
Mentioning confidence: 99%