ESA GNC Conference Papers Repository

Vision-based pose estimation and relative navigation around uncooperative space objects
V.P. Pesce, M.L. Lavagna, S. Sarno, R.O. Opromolla, M.G. Grassi
Presented at:
Salzburg 2017
Autonomous relative navigation in space has been intensively studied in recent decades due to its vast range of applications. In particular, precise pose and motion estimation of an uncooperative object, such as a Resident Space Object (RSO), has potential applications in the domain of space debris removal. This paper investigates the possibility of combining image processing and pose estimation techniques with robust filtering methods. The proposed approach is composed of two main blocks. The first block provides estimates of the pose (relative position and attitude) of the observed uncooperative object by processing images acquired by a monocular camera. Assuming partial knowledge of the shape of the inspected space object (target), classical corner detection techniques are used to extract feature points, which are then used to estimate the target's relative position and attitude parameters. This is done by searching for the pose solution that minimizes the error obtained when matching the extracted features to the target model. The second block uses these position and attitude data to correct the estimate of the relative state provided by a navigation filter. Among all possible strategies, Kalman filters have been chosen for their robustness. Their simplicity guarantees very good computational performance, which is particularly important in real-time estimation applications. Due to the non-linearity of the equations of relative motion, an Extended Kalman Filter (EKF) is proposed. In addition, due to the high level of uncertainty in this kind of scenario, a robust non-linear filtering approach is used. In particular, the so-called Extended H-∞ Filter is tested. This method makes the EKF estimation more robust by using H-∞ techniques, usually restricted to linear cases. A numerical validation of the filtering part is presented using simulated measurements as input.
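The model-matching step of the first block can be illustrated with a minimal sketch. This is not the paper's actual image-processing pipeline: it assumes the 2-D/3-D feature correspondences have already been established (the corner detection and matching stages are omitted), and it uses a simple Gauss-Newton refinement with a numerical Jacobian purely for illustration.

```python
import numpy as np

def rodrigues(rvec):
    """Rotation matrix from an axis-angle (Rodrigues) vector."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def project(model_pts, rvec, t, cam):
    """Project 3-D model points into the image with intrinsics matrix cam."""
    pts_c = model_pts @ rodrigues(rvec).T + t   # model -> camera frame
    uv = pts_c[:, :2] / pts_c[:, 2:3]           # perspective division
    return uv * np.array([cam[0, 0], cam[1, 1]]) + cam[:2, 2]

def pose_from_features(model_pts, image_pts, cam, rvec0, t0,
                       iters=15, eps=1e-6):
    """Gauss-Newton refinement of the pose (rotation vector, translation)
    that minimizes the reprojection error between the target model points
    and the extracted image features."""
    p = np.hstack([rvec0, t0]).astype(float)

    def residual(p):
        return (project(model_pts, p[:3], p[3:], cam) - image_pts).ravel()

    for _ in range(iters):
        r = residual(p)
        J = np.empty((r.size, 6))               # numerical Jacobian
        for j in range(6):
            dp = np.zeros(6)
            dp[j] = eps
            J[:, j] = (residual(p + dp) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + step
    return p[:3], p[3:]
```

Given a rough initial guess (e.g. from the previous frame), the refinement converges to the pose whose projected model points best overlap the detected corners, which is the minimization the abstract describes.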
The measured pose and its covariance matrix are obtained by reproducing vision-based pose determination errors and measurement frequencies. The results of the classical EKF and of its robust version are critically compared. Moreover, preliminary results of an experimental validation of the complete algorithm are presented. An experimental setup has been developed at the Politecnico di Milano Department of Aerospace Science and Technologies. A 7-DOF robotic arm (Mitsubishi PA-10) is used to replicate the relative motion of the target with respect to the chaser. A satellite mock-up is installed on the robotic arm end-effector, and a fixed camera acquires images of the mock-up. Off-line image processing is then performed to reconstruct the pose, and the resulting data are fed into the filtering block as measurements. The results obtained with the experimental setup are finally analysed and compared to the numerical ones.
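The filtering block that ingests these pose measurements can be sketched as a single predict/update cycle. The sketch below assumes linearized Clohessy-Wiltshire relative dynamics, an assumption not stated in the abstract: with this linear model the filter reduces to a plain Kalman filter, whereas the paper's EKF relinearizes the full non-linear relative motion at each step, and the Extended H-∞ variant replaces the gain computation with a minimax criterion.

```python
import numpy as np

def cw_stm(n, dt):
    """Clohessy-Wiltshire state-transition matrix for the relative state
    [x, y, z, vx, vy, vz] (x radial, y along-track, z cross-track),
    with n the target's orbital mean motion."""
    s, c = np.sin(n * dt), np.cos(n * dt)
    return np.array([
        [4 - 3*c,      0, 0,    s/n,          2*(1 - c)/n,       0],
        [6*(s - n*dt), 1, 0,   -2*(1 - c)/n, (4*s - 3*n*dt)/n,   0],
        [0,            0, c,    0,            0,                 s/n],
        [3*n*s,        0, 0,    c,            2*s,               0],
        [-6*n*(1 - c), 0, 0,   -2*s,          4*c - 3,           0],
        [0,            0, -n*s, 0,            0,                 c],
    ])

def kalman_step(x, P, z, Phi, H, Q, R):
    """One predict/update cycle: propagate the state and covariance,
    then correct them with a pose measurement z."""
    x_pred = Phi @ x                      # state prediction
    P_pred = Phi @ P @ Phi.T + Q          # covariance prediction
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_upd = x_pred + K @ (z - H @ x_pred)
    P_upd = (np.eye(len(x)) - K @ H) @ P_pred
    return x_upd, P_upd
```

In this arrangement the vision block plays the role of the sensor: each reconstructed pose, together with its covariance (entering through R), corrects the propagated relative state exactly as the abstract describes for the simulated and experimental measurement streams.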