ESA GNC Conference Papers Repository

Title:
Generation of Artificial Infrared Camera Images for Visual Navigation Simulation
Authors:
Krystian Zyśk, Michał Hałoń, Kacper Kaczmarek, Marcin Kasprzyk, Piotr Rodo, Olgierd Skromak, Mateusz Sochacki
Presented at:
Sopot 2023
DOI:
Full paper:
Abstract:

Visual navigation is a cornerstone of many modern navigation systems, from computer mouse sensors through cars and guided missiles to spacecraft proximity operations and landers. Since 2017 the Students' Space Association at Warsaw University of Technology has been developing the FOK rocket - a guidance, navigation and control research and development platform. The rocket is aerodynamically controlled using canards and is capable of reaching an apogee of about 1700 metres and a speed of Mach 0.6. It is fully reusable thanks to a parachute recovery system. Currently, the team is developing a vision-based navigation system composed of visible-light and infrared seekers. The goal of the mission is to guide the rocket towards ground-based visible or infrared markers. As part of the project, the team works on software-in-the-loop testing of the developed GNC algorithms. The tests are conducted within the SKA RFS software - an in-house developed tool for simulating a variety of sounding rockets. It is a 6DOF flight simulation tool able to simulate multi-stage rockets equipped with multiple engines, multiple parachutes, and aerodynamic control systems. It also allows simulating the video stream from a rocket-mounted camera. As the next iteration of the FOK project aims to use an infrared camera, the team faced the challenge of simulating infrared camera images within the RFS software to test the developed vision navigation algorithms. This paper will present techniques for generating artificial infrared images for visual navigation simulation. Artificial images will be generated from source images taken in the visible-light spectrum. Two methods for artificial image generation will be investigated: the first using a simple correlation between the visible and infrared images, and the second utilising neural networks.
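The first of the two approaches - a linear correlation between visible-light channels and the infrared channel, fitted by least squares - can be sketched as follows. This is a minimal illustration only, assuming images flattened to an N×3 RGB array and an N-element IR array; the function names and the intercept term are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_rgb_to_ir(rgb, ir):
    """Fit a per-pixel linear model ir ~ a*R + b*G + c*B + d by least squares.

    rgb : (N, 3) float array of visible-light pixel intensities
    ir  : (N,)   float array of co-registered infrared intensities
    Returns the coefficient vector [a, b, c, d].
    """
    # Append a column of ones so the model includes an intercept term d.
    A = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(A, ir, rcond=None)
    return coeffs

def predict_ir(rgb, coeffs):
    """Apply the fitted linear model to new visible-light pixels."""
    A = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    return A @ coeffs
```

The grid-based variants mentioned in the paper would follow the same pattern, with each row of the design matrix holding the intensities of a whole pixel neighbourhood rather than a single pixel.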
The correlation method compares a set of reference infrared and visible-light spectrum images of the same scene to determine a correlation between the red, green and blue channels and an infrared channel. Several correlation functions are used: a simple linear correlation between single visible and infrared pixel intensities, and linear correlations over pixel grids of various sizes. The correlation functions will be determined using the least-squares method. The neural network method will use a generative adversarial network (GAN) to generate artificial infrared images from provided visible-spectrum images. The same set of source materials will be used for training as in the correlation method. Two sets of training images are considered: images acquired using an integrated infrared-visible spectrum camera mounted on an unmanned aerial vehicle (UAV), and additionally images from Earth observation satellites (namely data from the European Copernicus system). Images from the integrated infrared-visible camera will be acquired over relevant scenes - fields, forests, heathland - landscapes usually encountered during the prospective flight of the FOK rocket. The paper will conclude with a detailed comparison of the developed methods, both in terms of quantitative metrics describing the similarity of the artificially generated images to the original infrared images, execution speed and memory requirements, and in terms of subjective perception of the generated images. Examples of generated artificial imagery will be presented based on simulated FOK rocket flights as well as recordings from relevant drone flights.
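The quantitative comparison of generated and original infrared images could rely on standard full-reference similarity metrics such as RMSE and PSNR. A minimal sketch, assuming images normalised to the [0, 1] range (the paper does not specify which metrics are used, so these are illustrative assumptions):

```python
import numpy as np

def rmse(a, b):
    """Root-mean-square error between two images of equal shape."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio in dB; `peak` is the maximum pixel value."""
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float('inf')  # identical images
    return float(10.0 * np.log10(peak ** 2 / mse))
```

Structural metrics such as SSIM would complement these pixel-wise measures, since GAN outputs can score well perceptually while differing pixel by pixel.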