ESA GNC Conference Papers Repository

MSR-ERO Rendezvous Navigation Sensors and Image Processing
Keyvan Kanani, Alex Marchand, Alexandre Falcoz, Christian Kracht, Tristan Röll, Thomas Kämpfe, Antoine Lecocq, Michaël Richert, Pierre Goux, Thierry Bomer, Léo Novelli, Leila Lorenzoni, Manuel Sanchez Gestido
Presented at: Sopot 2023
The ESA/NASA Mars Sample Return (MSR) campaign aims to return samples of Martian material to Earth. The samples will be collected by the NASA-provided Perseverance rover and assembled in the Orbiting Sample (OS) container, which will then be injected into Mars orbit by the NASA-provided Mars Ascent Vehicle. The ESA-provided Earth Return Orbiter (ERO) will then autonomously detect and rendezvous with the OS in low Mars orbit, capture it, seal it, and safely bring it back to Earth. Airbus Defence and Space, under ESA contract, is designing and developing the ERO spacecraft and, in particular, its vision-based GNC system to be used during rendezvous (RDV). The GNC system includes two types of vision sensors: a Narrow Angle Camera (NAC) and a Light Detection and Ranging device (LiDAR), each integrating its own image processing to provide the ERO navigation with measurements of the OS relative position. From the navigation standpoint, the MSR-ERO rendezvous is a non-cooperative scenario with a target vehicle (the OS) characterized by an uncontrolled, fast tumbling motion. The rendezvous has been divided into a far-range phase (from ~50 km to ~500 m) and a close-range phase (from ~500 m to capture). In the far range, the vision system is required to provide only the OS relative line of sight (LoS), using the NAC, provided by Sodern, as the main sensor. The NAC has a 4.5° field of view (FoV) and a 1 Mpx detector (Faintstar2); it integrates a centre-of-brightness algorithm robust to proton impacts on the detector. In the close range, the main sensor is a scanning LiDAR provided by Jena-Optronik, with an adaptable FoV from 0.5° to 40° and a maximum scan frequency of 2 Hz; it integrates a 3D barycentring processing step to assess the OS position from the 3D point cloud directly measured by the LiDAR. This paper focuses on the vision sensors of the MSR-ERO spacecraft.
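To make the two measurement principles concrete, the sketch below illustrates the idea of a centre-of-brightness centroid (far range) and a 3D point-cloud barycentre (close range). This is an illustrative simplification only: the function names, the pinhole camera model, and the plain intensity threshold are assumptions for this example, not the flight algorithms, whose robustness logic (e.g. against proton hits) is not detailed in the abstract.

```python
import numpy as np

def centre_of_brightness(image, threshold):
    """Intensity-weighted centroid of pixels above a threshold.

    The threshold is a crude stand-in for the flight algorithm's
    rejection of spurious bright pixels (e.g. proton impacts).
    Returns (col, row) in pixels, or None if no pixel qualifies.
    """
    rows, cols = np.nonzero(image > threshold)
    w = image[rows, cols].astype(float)
    total = w.sum()
    if total == 0.0:
        return None  # no target detected
    return (cols @ w) / total, (rows @ w) / total

def los_from_pixel(cx, cy, width, height, fov_deg):
    """Convert a pixel centroid to a line-of-sight unit vector,
    assuming a simple pinhole model with square pixels and the
    boresight at the detector centre."""
    f = (width / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length, px
    v = np.array([cx - width / 2.0, cy - height / 2.0, f])
    return v / np.linalg.norm(v)

def point_cloud_barycentre(points):
    """Barycentre (mean) of an N x 3 LiDAR point cloud, as a simple
    estimate of the target's relative position."""
    return np.asarray(points, dtype=float).mean(axis=0)
```

For a small, near-spherical target such as the OS, the barycentre of the returned point cloud is a reasonable first-order position estimate; the LoS conversion above would correspond to the NAC's 4.5° FoV over its 1 Mpx detector.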
The first section describes the MSR mission, including its requirements, and recalls the vision-sensor trade-off performed in the early phases of the MSR-ERO project. The second section presents the vision-sensor architecture within the ERO system and the sensors' operation concept (on board/on ground). The third section describes the selected sensors and their image processing in more detail. The conclusions provide an overview of the sensors' development plans, together with preliminary performance test results.