This paper proposes Thales Alenia Space's vision-based navigation solution for close-proximity operations in autonomous space rendezvous with non-cooperative targets. The proposed solution covers all phases of the navigation. First, a neural network robustly extracts the target silhouette from a complex background. Then, the binary silhouette is used to retrieve the initial relative pose using a detection algorithm. We propose an innovative approach to retrieve the object's pose using a precomputed set of invariants and geometric moments. The observation is extended over a set of consecutive frames in order to allow the rejection of outlying measurements and to obtain a robust pose initialization. Once an initial estimate of the pose is acquired, a recursive tracking algorithm is initialized, based on extracting the observed silhouette contours and matching them against the 3D geometric model of the target. The detection algorithm runs in parallel with the tracker in order to correct the tracking in case of diverging measurements. The measurements are then integrated into a dynamic filter, which increases the robustness of target pose estimation, enables the estimation of the target's translational velocity and rotation rate, and implements a computationally efficient delay-management technique that allows merging delayed and infrequent measurements. The overall navigation solution has a low computational load, which makes it compatible with space-qualified microprocessors. The solution is tested and validated in different close-proximity scenarios using synthetic images generated with Thales Alenia Space's rendering engine SpiCam.
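The abstract does not specify which invariant set the detection algorithm precomputes; as a minimal illustrative sketch, the first two Hu-style moment invariants of a binary silhouette can be computed from central moments with NumPy (the function name and details below are assumptions, not the authors' method):

```python
import numpy as np

def hu_invariants(silhouette):
    """First two Hu moment invariants of a binary silhouette.

    These are translation-, scale-, and rotation-invariant shape
    descriptors; they stand in here for whatever precomputed
    invariant set the paper actually uses.
    """
    ys, xs = np.nonzero(silhouette)
    m00 = len(xs)                      # zeroth moment = silhouette area
    cx, cy = xs.mean(), ys.mean()      # centroid (translation invariance)
    x, y = xs - cx, ys - cy

    def eta(p, q):
        # normalized central moment: mu_pq / m00^((p+q)/2 + 1)
        return (x ** p * y ** q).sum() / m00 ** ((p + q) / 2 + 1)

    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    h1 = e20 + e02                       # invariant under rotation
    h2 = (e20 - e02) ** 2 + 4 * e11 ** 2
    return h1, h2
```

Because the invariants depend only on normalized central moments, a rotated or translated copy of the same silhouette yields the same descriptor, which is what makes a precomputed lookup over target orientations feasible.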
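The delay-management technique itself is not described in the abstract; one common way to merge delayed measurements, shown here as a hypothetical sketch on a 1-D constant-velocity model (all class and parameter names are illustrative), is to keep a short state history, rewind to the measurement's true epoch, apply the update there, and re-propagate to the present:

```python
import numpy as np

class DelayTolerantKF:
    """Toy 1-D constant-velocity Kalman filter that accepts
    time-stamped measurements arriving after a delay."""

    def __init__(self, x0, P0, q, r):
        self.q, self.r = q, r
        # history of (time, state, covariance) snapshots
        self.history = [(0.0, np.array(x0, float), np.array(P0, float))]

    def _predict(self, x, P, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        return F @ x, F @ P @ F.T + Q

    def _update(self, x, P, z):
        H = np.array([[1.0, 0.0]])       # we observe position only
        S = H @ P @ H.T + self.r
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x, P

    def ingest(self, t_meas, z, t_now):
        # rewind: find the last stored state at or before the
        # measurement epoch, apply the update there
        i = max(j for j, (t, _, _) in enumerate(self.history) if t <= t_meas)
        t, x, P = self.history[i]
        x, P = self._predict(x, P, t_meas - t)
        x, P = self._update(x, P, z)
        # discard snapshots invalidated by the out-of-order update,
        # then re-propagate to the current time
        self.history = self.history[: i + 1] + [(t_meas, x, P)]
        x, P = self._predict(x, P, t_now - t_meas)
        self.history.append((t_now, x, P))
        return x
```

The rewind-and-replay pattern trades a small amount of memory (the snapshot history) for the ability to fuse infrequent, late-arriving measurements at their correct epoch instead of treating them as current.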