Temps-Espace-Société

From Small Body Silhouettes to Pixel-Level Detail: A Synergistic Multi-Scale Framework for 3D Reconstruction and Rotation State Estimation in Deep Space Exploration

by Dr Yaqiong Wang (LTE, Observatoire de Paris)

Europe/Paris
Denisse (Observatoire de Paris)

Observatoire de Paris, 77 avenue de l'Observatoire

Description

Accurate estimation of the 3D shape and rotation axis of small celestial bodies is crucial for deep space missions such as asteroid sample return and planetary defense, enabling navigation and risk assessment. However, observation conditions vary widely, from distant approach to close-range survey, and pose significant challenges for any single modeling method. This work presents a synergistic multi-scale framework that applies the most suitable modeling strategy in each observation phase: (1) at far range, a voxel-divided shape-from-silhouette (VD-SFS) method efficiently reconstructs an initial shape and rotation axis from silhouette images; (2) at mid-range, a structure-from-motion (SfM) based approach dynamically estimates camera poses and builds global shape and axis models from sparse images with distinctive features; (3) at close range, a photometry-based method exploits high-resolution, multi-angle images to recover local terrain at pixel-level detail by leveraging surface reflectance and illumination cues. This hierarchical, adaptive framework enables robust, coherent, and autonomous estimation of shape and rotation state throughout the full exploration sequence. Validation on simulated and real mission data (e.g., from Rosetta and OSIRIS-REx) demonstrates that the proposed approach substantially improves the autonomy, efficiency, and precision of small body exploration.
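The announcement gives no implementation details; purely as an illustrative sketch of the first (far-range) stage, the snippet below carves a voxel grid against synthetic silhouettes of a spinning ellipsoid, assuming an orthographic camera and a known spin axis. It is a minimal NumPy example of generic silhouette-based voxel carving, not the VD-SFS method presented in the talk; all function names, the ellipsoid target, and the camera model are assumptions made only for this demonstration.

```python
import numpy as np

def rotation_matrix_z(angle):
    """Rotation about the body's spin axis (assumed here to be +Z)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def carve_visual_hull(silhouettes, angles, grid_res=64, half_extent=1.0, img_res=128):
    """Carve a voxel grid against silhouette masks (orthographic camera model).

    silhouettes : list of (img_res, img_res) boolean masks, one per view
    angles      : body rotation phase (rad) for each view
    Returns a boolean occupancy grid of shape (grid_res, grid_res, grid_res).
    """
    # Voxel-centre coordinates filling the cube [-half_extent, half_extent]^3.
    axis = np.linspace(-half_extent, half_extent, grid_res)
    X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
    voxels = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
    occupied = np.ones(len(voxels), dtype=bool)

    for mask, angle in zip(silhouettes, angles):
        # Rotate voxel centres into the camera frame (camera views along the Y axis).
        pts = voxels @ rotation_matrix_z(angle).T
        # Orthographic projection: (x, z) camera coordinates -> pixel indices.
        px = np.clip(((pts[:, 0] + half_extent) / (2 * half_extent)
                      * (img_res - 1)).round().astype(int), 0, img_res - 1)
        pz = np.clip(((pts[:, 2] + half_extent) / (2 * half_extent)
                      * (img_res - 1)).round().astype(int), 0, img_res - 1)
        # Carve away every voxel whose projection falls outside this silhouette.
        occupied &= mask[px, pz]

    return occupied.reshape(grid_res, grid_res, grid_res)

def ellipsoid_silhouette(angle, img_res=128, semi_axes=(0.8, 0.5, 0.6)):
    """Synthetic silhouette of an ellipsoidal 'asteroid' spinning about +Z."""
    coords = np.linspace(-1.0, 1.0, img_res)
    xs, zs = np.meshgrid(coords, coords, indexing="ij")
    a, b, c = semi_axes
    # The orthographic silhouette of the rotated ellipsoid is an axis-aligned ellipse.
    ax = np.sqrt((a * np.cos(angle)) ** 2 + (b * np.sin(angle)) ** 2)
    return (xs / ax) ** 2 + (zs / c) ** 2 <= 1.0

if __name__ == "__main__":
    # Twelve evenly spaced rotation phases, as if observing one full spin period.
    angles = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
    sils = [ellipsoid_silhouette(t) for t in angles]
    hull = carve_visual_hull(sils, angles)
    # Fraction of the voxel cube retained after carving (the visual-hull estimate).
    print("occupied voxel fraction:", hull.mean())
```

In a real approach phase the spin axis is not known in advance, so the carving would have to be repeated or optimized over candidate axis orientations, and a perspective camera with estimated poses would replace the orthographic assumption used here.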

https://cnrs.zoom.us/j/94599131540?pwd=TaVwZzQGtCnPxNW7HwnyeeiUZApfbc.1
