ESA GNC Conference Papers Repository
AIVIONIC - Artificial intelligence techniques in on-board avionics and software
The use of Artificial Intelligence (AI), and in particular Deep Learning (DL), has led to major advances in several industries, including automotive, agriculture, and healthcare, disrupting traditional approaches and leading to a myriad of novel ground-breaking applications. The space domain has also been reached by the innovation potential of AI, chiefly in Earth Observation applications. Moreover, increasingly powerful processing units, together with enhanced and less computationally intensive AI algorithms, make it possible to explore new AI applications, especially onboard implementations. The objective of the AIVIONIC technology development project is to implement a HW/SW demonstrator of an AI-based Visual Navigation System. This follows a novel development line towards demonstrating the dependable use of AI in space-critical systems. Neural Network (NN)-based algorithms were identified considering specific mission characteristics such as on-board implementability and algorithm adaptability and flexibility. Lightweight, modular AI processing pipelines were selected and implemented, employing Object Detection and Keypoint Regression Networks which comply with the onboard processing resource and latency restrictions while offering the desired performance.

Rigorous validation plays a major role for safety- and mission-critical elements. In AIVIONIC, the AI validation logic followed two complementary approaches. Firstly, validation was performed at all steps of the AI development process, starting from the design, prototyping, and training of the AI solutions, and ending with implementation and validation on the target HW. Both synthetic and laboratory image data sets, specifically created during the project, were used to validate the AI-IP.
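The two-stage pipeline described above (an Object Detection Network that localises the target, followed by a Keypoint Regression Network operating on the detected region) can be sketched as follows. This is a minimal illustration only: the function names, the fixed bounding box, and the hard-coded keypoints are hypothetical stand-ins for the trained networks, which are not described at this level of detail in the abstract.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BoundingBox:
    x: float  # top-left corner, pixels
    y: float
    w: float  # width and height, pixels
    h: float

def detect_target(image_size: Tuple[int, int]) -> BoundingBox:
    """Stand-in for the Object Detection Network: returns a region of
    interest around the target (here simply a fixed, centred box)."""
    w, h = image_size
    return BoundingBox(x=0.25 * w, y=0.25 * h, w=0.5 * w, h=0.5 * h)

def regress_keypoints(box: BoundingBox) -> List[Tuple[float, float]]:
    """Stand-in for the Keypoint Regression Network: returns keypoint
    positions normalised to [0, 1] within the detected box."""
    return [(0.1, 0.1), (0.9, 0.1), (0.5, 0.9)]

def keypoints_to_image(box: BoundingBox,
                       kps: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Map box-normalised keypoints back to full-image pixel coordinates,
    ready to be fed to a pose/navigation solver."""
    return [(box.x + u * box.w, box.y + v * box.h) for u, v in kps]

# Example: run the modular pipeline on a 1024x1024 image.
box = detect_target((1024, 1024))
image_kps = keypoints_to_image(box, regress_keypoints(box))
```

The value of the modular split is that each stage can be retrained, validated, and swapped independently, which matches the flexibility and reusability goals stated above.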
Extensive Monte Carlo campaigns were performed to measure the impact of input data variations, including variations in the image data, on the overall navigation performance and to assess the robustness of the AI-IP navigation pipeline. Secondly, AI runtime monitoring, i.e. the active monitoring of the AI algorithms while in operation, was implemented to support the AI algorithm validation. The HW platform for the AIVIONIC visual navigation system is composed of Ubotica's CogniSat ecosystem, which is based on the Intel Myriad family of Vision Processing Units, together with the Xilinx Zynq UltraScale+, covering both image pre-processing and AI inference for the flight elements of the architecture. The architecture can support multiple VPUs for both redundancy and performance. The obtained results show that the objective of developing a HW/SW demonstrator for a vision-based relative navigation system using AI in a dependable manner has been achieved by the AIVIONIC study, reaching TRL 4. The AI techniques achieve the accuracy and latency needed to meet the mission requirements, and provide advantages in terms of flexibility and reusability. Data availability and AI dependability methods play a key role in the development and use of AI in space-critical systems, and AIVIONIC provides successful solutions for both. The paper will describe the main challenges faced, the results obtained, and the progress made during the project.
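The Monte Carlo robustness campaign described above can be sketched as a loop that draws random input perturbations per run, executes the navigation pipeline, and aggregates the resulting error statistics. The perturbation model, the `estimate_range` stand-in for the AI-IP pipeline, and all numerical values below are hypothetical illustrations, not the project's actual parameters.

```python
import random
import statistics

def estimate_range(true_range_m: float, brightness_offset: float,
                   noise_sigma: float, rng: random.Random) -> float:
    """Stand-in for the AI-IP navigation pipeline's range estimate:
    here the error simply grows with the injected image perturbations."""
    bias = 0.02 * true_range_m * abs(brightness_offset)
    return true_range_m + bias + rng.gauss(0.0, noise_sigma)

def monte_carlo_campaign(n_runs: int = 1000, true_range_m: float = 50.0,
                         seed: int = 0):
    """Draw random image-perturbation parameters per run and collect
    the resulting navigation-error statistics."""
    rng = random.Random(seed)  # fixed seed for a reproducible campaign
    errors = []
    for _ in range(n_runs):
        brightness = rng.uniform(-0.5, 0.5)  # simulated exposure variation
        noise = rng.uniform(0.0, 0.2)        # simulated sensor noise (m)
        est = estimate_range(true_range_m, brightness, noise, rng)
        errors.append(est - true_range_m)
    return statistics.mean(errors), statistics.pstdev(errors)

mean_err, std_err = monte_carlo_campaign()
```

Statistics such as the mean and dispersion of the navigation error over many perturbed runs are then compared against the mission requirements, which is the role the Monte Carlo campaigns play in the validation logic above.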