ESA GNC Conference Papers Repository

Generating a Synthetic Event-Based Vision Dataset for Navigation and Landing
Loïc Azzalini, Emmanuel Blazquez, Alexander Hadjiivanov, Gabriele Meoni, Dario Izzo
Presented at:
Sopot 2023
Neuromorphic vision technology holds great promise for space applications owing to its low power consumption, high temporal precision and high dynamic range [1][2]. Dynamic vision sensors, or event-cameras as they are commonly known, have disrupted research fields across the computer vision landscape, from object tracking to image reconstruction, and from robotics to the automotive industry [3]. A dynamic vision sensor is event-driven by design: independently sensitive image pixels respond only to changes in scene brightness, leading to sparse and asynchronous output streams of data. Only relative motion between the scene and the camera generates data, thus avoiding the capture of redundant static information and wasteful usage of onboard resources. Recent demonstrations of dynamic vision for in-orbit space situational awareness (SSA) [4][5] are encouraging, warranting further adoption of event-based hardware in the space sector. Whether tracking space debris or landing on a planetary body or asteroid, events are captured in the image plane of the dynamic vision sensor as a result of the relative dynamics between the onboard camera and the space environment. While event-based vision for guidance, navigation and control (GNC) has been considered at a theoretical level in the past [1], onboard demonstrations of this technology have yet to be deployed. Its limited adoption to date can, in part, be explained by the lack of event-based datasets tailored to vision-based navigation in space, and of the state-of-the-art event processing tools found in other disciplines (e.g., robotics). Our main contribution is a software pipeline to generate event-based datasets from simulated spacecraft trajectories in a photorealistic scene generator for planetary bodies and asteroids. Various landing scenarios are considered in this work to illustrate how the pipeline may be configured to capture event-based representations of surface features.
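The event-generation principle described above can be sketched in a few lines. The following is an illustrative first-order model (our own, not the pipeline's actual implementation): a pixel emits an event each time its log-intensity change since the previous frame crosses a contrast threshold, with positive or negative polarity depending on the sign of the change. Real converters such as v2e [8] additionally model leak and shot noise, finite pixel bandwidth and refractory periods.

```python
import numpy as np

def frames_to_events(frame_prev, frame_next, t_prev, t_next, threshold=0.2):
    """Emit DVS-style events wherever the log-intensity change between two
    frames crosses the contrast threshold. Each event is (x, y, t, polarity).

    Simplified sketch: real sensors also exhibit noise, motion blur and
    per-pixel threshold mismatch, which converters like v2e reproduce.
    """
    eps = 1e-6  # avoid log(0) for dark pixels
    dlog = np.log(frame_next + eps) - np.log(frame_prev + eps)
    # Number of threshold crossings per pixel (a fast edge fires several events)
    n_events = np.floor(np.abs(dlog) / threshold).astype(int)
    events = []
    for y, x in zip(*np.nonzero(n_events)):
        polarity = 1 if dlog[y, x] > 0 else -1
        for k in range(n_events[y, x]):
            # Spread event timestamps linearly between the two frame times
            t = t_prev + (k + 1) / (n_events[y, x] + 1) * (t_next - t_prev)
            events.append((int(x), int(y), float(t), polarity))
    events.sort(key=lambda e: e[2])
    return events
```

Static pixels generate no output at all, which is precisely the sparsity property that makes the sensor attractive for resource-constrained onboard processing.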
We will base an upcoming scientific crowd-sourcing initiative [6] on this landing dataset to engage the wider computer vision community in determining how best to handle event-based data for navigation and landing. We propose a flexible data pipeline which takes trajectory specifications as input and outputs streams of events corresponding to the motion of features in the scene. From the initial conditions of the spacecraft and the properties of the target body, we solve an optimal control problem corresponding to a non-ventral, minimum-mass descent trajectory on the Moon and Mars. The trajectories are then used to manipulate the viewpoint of a pinhole camera model in the Planet and Asteroid Natural Scene Generator (PANGU) [7], which renders synthetic images of the surface during approach. Finally, a video-to-event converter [8] is used to generate synthetic events induced by the simulated landings. The sparse and asynchronous events include various artifacts (noise, motion blur, etc.) modelled after the performance of dynamic vision sensors in the field. The resulting dataset captures dynamic, event-based representations of common surface features such as craters, boulders and the target body's horizon. The upcoming data challenge will focus on the representation of event-based data, itself a new paradigm in computer vision, and its processing to meet navigation and landing objectives. The success of previous scientific crowd-sourcing initiatives on spacecraft pose estimation [9][10] testifies to the effectiveness of competitive data challenges in garnering interest in, and solutions to, novel optical navigation problems. Beyond the data challenge, the event-based landing dataset is envisioned to support investigations into future onboard opportunities for event- and vision-based navigation, including optical-flow-based motion estimation, surface feature identification and tracking, and terrain relative navigation.
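The viewpoint-manipulation step of the pipeline rests on the standard pinhole camera model: a surface landmark (e.g. a crater centre) is transformed into the camera frame defined by the spacecraft's position and attitude along the descent trajectory, then projected into pixel coordinates. The sketch below illustrates the geometry only; the function name and arguments are ours and do not reflect PANGU's actual interface.

```python
import numpy as np

def project_pinhole(points_world, cam_pos, R_wc, f, cx, cy):
    """Project 3-D surface points into a pinhole camera's image plane.

    points_world : (N, 3) landmark positions in the world frame
    cam_pos      : (3,) camera position in the world frame
    R_wc         : (3, 3) rotation taking world-frame vectors to the camera frame
    f, cx, cy    : focal length (pixels) and principal point

    Returns (N, 2) pixel coordinates; points behind the camera map to NaN.
    """
    # World frame -> camera frame
    p_cam = (R_wc @ (points_world - cam_pos).T).T
    uv = np.full((len(points_world), 2), np.nan)
    in_front = p_cam[:, 2] > 0  # only points in front of the image plane
    uv[in_front, 0] = f * p_cam[in_front, 0] / p_cam[in_front, 2] + cx
    uv[in_front, 1] = f * p_cam[in_front, 1] / p_cam[in_front, 2] + cy
    return uv
```

Sweeping the camera pose along the optimised descent trajectory and differencing consecutive rendered frames is what ultimately drives the video-to-event conversion stage.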
By releasing this pipeline, we hope to promote the creation of new datasets for event-based navigation around other planetary bodies and asteroids, and to support the development of state-of-the-art event processing tools for future space missions.

References:

[1] Izzo, D., Hadjiivanov, A., Dold, D., Meoni, G., & Blazquez, E. (2022). Neuromorphic computing and sensing in space. arXiv preprint arXiv:2212.05236.
[2] Roffe, S., Akolkar, H., George, A. D., Linares-Barranco, B., & Benosman, R. B. (2021). Neutron-induced, single-event effects on neuromorphic event-based vision sensor: A first step and tools to space applications. IEEE Access, 9, 85748-85763. doi: 10.1109/ACCESS.2021.3085136
[3] Gallego, G., Delbrück, T., Orchard, G., Bartolozzi, C., Taba, B., Censi, A., ... Scaramuzza, D. (2022). Event-based vision: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(1), 154-180. doi: 10.1109/TPAMI.2020.3008413
[4] Roffe, S., Schwarz, T., Cook, T., Perryman, N., Goodwill, J., Gretok, E., ... George, A. (2020). CASPR: Autonomous Sensor Processing Experiment for STP-H7.
[5] McHarg, M. G., Balthazor, R. L., McReynolds, B. J., Howe, D. H., Maloney, C. J., O'Keefe, D., ... Cohen, G. (2022). Falcon Neuro: an event-based sensor on the International Space Station. Optical Engineering, 61(8), 085105. doi: 10.1117/1.OE.61.8.085105
[6] Kelvins - European Space Agency's Advanced Concepts Competition Website.
[7] Martin, I., & Dunstan, M. (2021, November 20). PANGU v6: Planet and Asteroid Natural Scene Generation Utility.
[8] Hu, Y., Liu, S. C., & Delbruck, T. (2021). v2e: From video frames to realistic DVS events. In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE.
[9] Kisantal, M., Sharma, S., Park, T. H., Izzo, D., Martens, M., & D'Amico, S. (2020). Satellite pose estimation challenge: Dataset, competition design, and results. IEEE Transactions on Aerospace and Electronic Systems, 56(5), 4083-4098. doi: 10.1109/TAES.2020.2989063
[10] Park, T. H., Martens, M., Lecuyer, G., Izzo, D., & D'Amico, S. (2022). SPEED+: Next-generation dataset for spacecraft pose estimation across domain gap. In 2022 IEEE Aerospace Conference (AERO). doi: 10.1109/AERO53065.2022.9843439