ESA GNC Conference Papers Repository
Robust model-based tracking through fusion of optical flow and edge tracking techniques
In the context of spacecraft autonomous rendezvous and space debris removal, estimating an accurate relative pose between the target and chaser spacecraft is a central issue. An accurate full 6DOF relative pose is needed to estimate the target attitude state and to safely maneuver the chaser spacecraft towards the capture point. LIDAR solutions have large power demands and some limitations at very close range, and in any case an independent system is required for FDIR or robustness, so there is increasing interest in applying vision-based techniques to this problem. Provided that a 3D model of the target spacecraft is known, model-based tracking techniques have been proposed in the past. They consist in aligning features identified in the model with their corresponding projections in the image captured by the camera. Edge features offer good invariance to illumination changes and image noise and are particularly suitable for poorly textured scenes. Still, illumination conditions in space are far from optimal, leading to poor performance in some conditions and even to the loss of target tracking. We present some crucial enhancements to the edge model-based tracking scheme, such as the ability to track only illuminated edges (ignoring shadowed ones, which could be misleading), as well as a novel approach to initializing the edge tracking solution for every new frame of the sequence. This is a major contribution: for every new frame, the algorithm must minimize the distance between the edges found in the image and the re-projection of the target edges onto the image. If this minimization process gets stuck in a local minimum, the computed relative pose will be incorrect and the system can lose track of the target, with very negative consequences.
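The core of the per-frame alignment step can be illustrated with a deliberately simplified sketch (not the paper's implementation): model edge samples are matched to their nearest detected image edge points and the pose update is found by least squares. Here the "pose" is reduced to a 2D image-plane translation so the local-minimum behavior is easy to see; all names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def align_edges(model_pts, image_pts, t0, iters=20):
    """Toy edge alignment: iteratively estimate the 2D translation that
    minimizes the distance between projected model edge samples and
    detected image edge points (an ICP-style stand-in for the full
    6DOF minimization described in the text)."""
    t = np.asarray(t0, dtype=float)
    for _ in range(iters):
        proj = model_pts + t  # "re-projection" of model edges under the current pose
        # match each projected sample to its nearest image edge point;
        # a bad initial t can lock in wrong matches -> local minimum
        d = np.linalg.norm(proj[:, None, :] - image_pts[None, :, :], axis=2)
        nearest = image_pts[np.argmin(d, axis=1)]
        t += (nearest - proj).mean(axis=0)  # least-squares translation update
    return t

# synthetic "edge map": the model samples displaced by a known motion
rng = np.random.default_rng(0)
model = rng.uniform(0.0, 100.0, size=(50, 2))
t_true = np.array([3.0, -2.0])
image = model + t_true

t_est = align_edges(model, image, t0=[0.0, 0.0])
```

With a small inter-frame motion the iteration converges to the true displacement; a large motion (or a poor initial guess) makes the nearest-neighbor matches wrong, which is exactly the failure mode the optical flow initialization below is meant to avoid.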
We take advantage of sparse optical flow techniques to properly initialize the edge-based tracking, leading to a tracking scheme that is more robust to illumination changes and high-speed maneuvers. These optical flow techniques cannot be used on their own to track the target, as they suffer from drift over time and can also drift with illumination changes. However, they are not prone to fall into the same local minima as the edge tracking techniques, and therefore they can provide a good initialization from which the edge tracking converges to the right solution and corrects the remaining drift. In short, our approach combines the best of both worlds to implement a robust tracking scheme. Finally, advancing towards a flyable system, the proposed algorithm has been implemented on an FPGA, exploiting the possibility of performing parallel computations in a pipelined system. We started with a software implementation and incrementally accelerated the computations on the FPGA; this allowed us to obtain a complete hardware-software design and to experiment with different configurations in order to test and select the best tradeoff between performance and hardware resource consumption according to the specific scenario requirements. A fully working prototype has been implemented on a Xilinx Zynq based system; the functional core with the most intensive computations has been implemented on a space-qualified system based on a Xilinx Virtex-5, showing the portability of the design.
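The optical flow initialization can likewise be sketched in its simplest form (again an illustrative assumption, not the flight implementation): a single-window Lucas-Kanade solve that recovers the inter-frame image motion, which can then seed the edge tracker close to the correct solution.

```python
import numpy as np

def lucas_kanade(I0, I1):
    """Single-window Lucas-Kanade: least-squares estimate of the
    image-plane motion (dx, dy) between two frames, the building block
    of sparse optical flow."""
    Iy, Ix = np.gradient(I0)   # spatial gradients (rows = y, cols = x)
    It = I1 - I0               # temporal difference
    # normal equations of  Ix*dx + Iy*dy + It = 0  over the window
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)  # (dx, dy) in pixels

# synthetic frames: a Gaussian blob translated by a known sub-pixel motion
y, x = np.mgrid[0:64, 0:64]
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)
I0, I1 = blob(32.0, 32.0), blob(32.4, 31.7)  # true flow: (+0.4, -0.3)

dx, dy = lucas_kanade(I0, I1)
```

Tracking such flow vectors at a sparse set of strong features gives a motion prediction that is insensitive to the edge tracker's local minima, even though the flow estimate itself drifts over time and must be corrected by the model-based alignment.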