Continuous-Time Multi-Sensor Odometry in the Wild
Semester/Master's Project
Figure: Visual-inertial odometry with a UAV.
Background
Multi-sensor odometry estimates a robot's full 6-DoF pose by fusing complementary data streams (cameras, LiDAR, IMU, and GNSS), each sampled at its own rate. Representing motion as a continuous-time trajectory, rather than as a sequence of discrete poses, simplifies multi-rate synchronization [1,2]. Each modality breaks down under different conditions: visual-inertial odometry drifts under rapid rotations, abrupt illumination changes, and in texture-poor scenes, while LiDAR-inertial odometry loses observability in geometrically featureless environments. Recent work such as FAST-LIVO2 [3] proposes a tightly coupled LiDAR-inertial-visual fusion strategy for robust odometry, but it still relies on tightly synchronized sensors.
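To make the synchronization argument concrete, the sketch below shows one way a continuous-time trajectory can be queried at arbitrary timestamps, so that a 30 Hz camera frame and a 200 Hz IMU sample are each evaluated at their own time without aligning discrete states. This is a minimal illustration, not a reference implementation: it assumes Eigen, all class and function names are hypothetical, and simple linear/slerp interpolation stands in for the cumulative B-splines typically used in continuous-time SLAM [1,2].

```cpp
#include <Eigen/Dense>
#include <Eigen/Geometry>
#include <iostream>
#include <iterator>
#include <map>

// One control pose of the continuous-time trajectory (illustrative).
struct Pose {
  Eigen::Quaterniond q;  // orientation, world <- body
  Eigen::Vector3d p;     // position in the world frame
};

// A trajectory defined by timestamped control poses, queryable at ANY
// time, so every sensor reading is evaluated at its own timestamp.
// Linear/slerp interpolation is a stand-in for the cumulative
// B-splines used in continuous-time SLAM.
class Trajectory {
 public:
  void AddControlPose(double t, const Pose& pose) { knots_[t] = pose; }

  // Interpolate the pose at an arbitrary query time t.
  Pose At(double t) const {
    auto hi = knots_.lower_bound(t);
    if (hi == knots_.begin()) return hi->second;           // clamp at start
    if (hi == knots_.end()) return std::prev(hi)->second;  // clamp at end
    auto lo = std::prev(hi);
    const double s = (t - lo->first) / (hi->first - lo->first);
    return {lo->second.q.slerp(s, hi->second.q),
            (1.0 - s) * lo->second.p + s * hi->second.p};
  }

 private:
  std::map<double, Pose> knots_;  // time -> control pose
};

int main() {
  Trajectory traj;
  traj.AddControlPose(0.0, {Eigen::Quaterniond::Identity(),
                            Eigen::Vector3d(0.0, 0.0, 0.0)});
  traj.AddControlPose(0.1, {Eigen::Quaterniond(Eigen::AngleAxisd(
                                0.2, Eigen::Vector3d::UnitZ())),
                            Eigen::Vector3d(1.0, 0.0, 0.0)});
  // A 30 Hz camera frame and a 200 Hz IMU sample query the same
  // trajectory at their own timestamps -- no pairwise sensor
  // synchronization is needed.
  const Pose cam = traj.At(1.0 / 30.0);
  const Pose imu = traj.At(0.005);
  std::cout << "camera position: " << cam.p.transpose() << "\n"
            << "imu position:    " << imu.p.transpose() << "\n";
}
```

In a spline-based formulation the same query additionally provides analytic time derivatives (velocity, acceleration), which is what makes asynchronous IMU, camera, and LiDAR residuals straightforward to stack in a single estimator.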
Description
The goal of this project is to develop a robust, continuous-time multi-sensor odometry system that can handle multi-rate synchronization. The resulting approach will be evaluated against existing multi-sensor odometry systems.
Work Packages
- Literature review of multi-sensor odometry
- Literature review of continuous-time trajectory representations
- Design of a multi-sensor fusion strategy
- Evaluation of the approach against existing multi-sensor odometry systems (see the error-metric sketch after this list)
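A common accuracy metric for the evaluation work package is the Absolute Trajectory Error (ATE). The sketch below, again assuming Eigen and with all names hypothetical, computes a position-only ATE RMSE using nearest-timestamp association; a complete evaluation pipeline (e.g., rpg_trajectory_evaluation or evo) would also spatially align estimate and ground truth before computing errors.

```cpp
#include <Eigen/Dense>
#include <cmath>
#include <cstddef>
#include <iostream>
#include <iterator>
#include <map>

// Position-only ATE as an RMSE: each estimated position is compared
// with the ground-truth position at the nearest timestamp. A full
// evaluation would first align the trajectories (e.g. SE(3)/Sim(3)
// Umeyama alignment).
double AtePositionRmse(const std::map<double, Eigen::Vector3d>& estimate,
                       const std::map<double, Eigen::Vector3d>& ground_truth) {
  double sum_sq = 0.0;
  std::size_t n = 0;
  for (const auto& [t, p_est] : estimate) {
    // Nearest ground-truth sample in time.
    auto hi = ground_truth.lower_bound(t);
    auto best = hi;
    if (hi == ground_truth.end() ||
        (hi != ground_truth.begin() &&
         t - std::prev(hi)->first < hi->first - t)) {
      best = std::prev(hi);
    }
    sum_sq += (p_est - best->second).squaredNorm();
    ++n;
  }
  return std::sqrt(sum_sq / static_cast<double>(n));
}

int main() {
  std::map<double, Eigen::Vector3d> gt{
      {0.0, Eigen::Vector3d(0.0, 0.0, 0.0)},
      {0.1, Eigen::Vector3d(1.0, 0.0, 0.0)}};
  std::map<double, Eigen::Vector3d> est{
      {0.0, Eigen::Vector3d(0.02, 0.0, 0.0)},
      {0.1, Eigen::Vector3d(0.97, 0.01, 0.0)}};
  std::cout << "ATE RMSE [m]: " << AtePositionRmse(est, gt) << "\n";
}
```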
Requirements
- Experience with C++ and ROS
References
- [1] Cioffi, G., Cieslewski, T., and Scaramuzza, D., "Continuous-time vs. discrete-time vision-based SLAM: A comparative study", IEEE Robotics and Automation Letters, 2022.
- [2] Hug, D., Alzugaray, I., and Chli, M., "Hyperion – A Fast, Versatile Symbolic Gaussian Belief Propagation Framework for Continuous-Time SLAM", European Conference on Computer Vision (ECCV), 2024.
- [3] Zheng, C., et al., "FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry", IEEE Transactions on Robotics, 2024.