2019
DOI: 10.1109/jssc.2018.2886342

Navion: A 2-mW Fully Integrated Real-Time Visual-Inertial Odometry Accelerator for Autonomous Navigation of Nano Drones

Abstract: This paper presents Navion, an energy-efficient accelerator for visual-inertial odometry (VIO) that enables autonomous navigation of miniaturized robots (e.g., nano drones), and virtual/augmented reality on portable devices. The chip uses inertial measurements and mono/stereo images to estimate the drone's trajectory and a 3D map of the environment. This estimate is obtained by running a state-of-the-art VIO algorithm based on non-linear factor graph optimization, which requires large irregularly structured me…
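As a rough illustration of the factor-graph back end the abstract refers to (a toy sketch, not the Navion chip's actual algorithm, state representation, or data layout), the snippet below estimates a tiny 1-D pose chain by Gauss-Newton: relative-motion factors stand in for preintegrated inertial terms and a single absolute factor stands in for a visual constraint. All variable names and measurement values are invented for the example.

```python
import numpy as np

# Toy factor-graph trajectory estimate (illustration only; NOT Navion's
# implementation): four scalar poses x0..x3, a prior on x0, relative-motion
# factors standing in for preintegrated inertial terms, and one absolute
# factor standing in for a visual constraint. Solved by Gauss-Newton.

x = np.zeros(4)                          # initial guess for poses x0..x3

prior = (0, 0.0, 100.0)                  # (index, measurement, weight): x0 ~ 0
relative = [(0, 1, 1.0, 10.0),           # (i, j, z, w): x_j - x_i ~ z
            (1, 2, 1.1, 10.0),
            (2, 3, 0.9, 10.0)]
absolute = (3, 2.8, 50.0)                # x3 ~ 2.8 ("visual" fix)

for _ in range(5):                       # Gauss-Newton iterations
    H = np.zeros((4, 4))                 # normal-equation matrix  J^T W J
    b = np.zeros(4)                      # right-hand side         J^T W r

    i, z, w = prior                      # prior factor: h(x) = x_i
    H[i, i] += w
    b[i] += w * (z - x[i])

    for i, j, z, w in relative:          # relative factors: h(x) = x_j - x_i
        r = z - (x[j] - x[i])
        J = np.zeros(4); J[i], J[j] = -1.0, 1.0
        H += w * np.outer(J, J)
        b += w * r * J

    i, z, w = absolute                   # absolute factor: h(x) = x_i
    H[i, i] += w
    b[i] += w * (z - x[i])

    dx = np.linalg.solve(H, b)           # Gauss-Newton step
    x += dx
    if np.linalg.norm(dx) < 1e-9:        # linear toy problem: one step suffices
        break

print("estimated poses:", x)             # x3 is pulled from 3.0 toward the 2.8 fix
```

The actual VIO back end optimizes over SE(3) poses, velocities, IMU biases, and landmarks rather than scalars, and the paper's contribution is mapping that large, irregularly structured problem onto low-power hardware; the sketch only conveys the structural idea of combining inertial and visual factors in one nonlinear least-squares problem.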

Cited by 117 publications (83 citation statements)
References 34 publications
“…We find a wide variety of high-level algorithms in autonomous drones are dependent on a core family of algorithms, namely, simultaneous localization and mapping (SLAM) and visual odometry (VO) [39][40][41][42][43]. These algorithms are the fundamental building blocks for many autonomous technologies [19,40,44] and are used in various tasks such as navigation, obstacle avoidance, and path planning. Designing drone systems that provide accurate localization in realtime on platforms with limited computational and energy resources is an active area of research [18,19,24].…”
Section: Autonomy In Drones
confidence: 99%
“…These algorithms are the fundamental building blocks for many autonomous technologies [19,40,44] and are used in various tasks such as navigation, obstacle avoidance, and path planning. Designing drone systems that provide accurate localization in realtime on platforms with limited computational and energy resources is an active area of research [18,19,24]. Therefore, to date, various implementations of SLAM with the focus on algorithmic-level optimizations [39,[41][42][43]45] or hardware acceleration [44,[46][47][48][49][50][51] have been proposed.…”
Section: Autonomy In Drones
confidence: 99%
“…For intelligent robot technology, visual semantic SLAM (Simultaneous Localization and Mapping) that merges semantic information is a potential use of visual semantic segmentation for intelligent robot technology. It has been proven that the classic VSLAM (Visual SLAM) technology is an appropriate solution to the positioning and navigation of mobile robots [7, 8], and that it can be implemented on low-power embedded platforms [9, 10, 11]. However, classic VSLAM is mostly based on low-level computer vision features (points, lines, etc.)…”
Section: Introduction
confidence: 99%
“…The hardware platform for processing of the SLAM algorithm on mobile robots mainly includes a CPU (Central Processing Unit) [11], FPGA [10, 19], and ASIC (Application Specific Integrated Circuit) [9, 20]. The platform for the semantic segmentation network is usually based on a GPU [15].…”
Section: Introduction
confidence: 99%