Continuous-Time Multi-Sensor Odometry in the Wild
Semester/Master's Project
The goal of this project is to develop a robust, continuous-time multi-sensor odometry system that fuses sensor streams arriving at different rates and with imperfect synchronization.
Odometry degradation scenarios shown in [4]
Background
Multi-sensor odometry estimates a robot’s full 6-DoF pose by fusing complementary data streams—cameras, LiDAR, IMU, and GNSS—each sampled at its own rate. Representing motion as a continuous-time trajectory, rather than as a sequence of discrete poses, simplifies multi-rate synchronization [1,2]. Each modality breaks down under different conditions: visual–inertial odometry drifts during rapid rotations, abrupt illumination changes, or texture-poor scenes, while LiDAR–inertial odometry loses observability in geometrically featureless environments. Recent work [1] proposes a fusion strategy for robust odometry, but it still relies on tightly synchronized sensors.
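The continuous-time idea can be made concrete with a small sketch: a uniform cubic B-spline over scalar control points (one trajectory axis) can be queried at any timestamp, which is exactly what makes fusing sensors sampled at different rates convenient. This is an illustrative stand-alone example, not code from [1] or [2]; the struct name and layout are invented for this sketch, and a real system would interpolate on SE(3) rather than per-axis.

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative sketch: a uniform cubic B-spline over scalar control points
// (one axis of a trajectory). Knots are dt seconds apart; any query
// timestamp inside the valid range yields an interpolated value, so each
// sensor measurement can be evaluated at its own arrival time.
struct UniformCubicBSpline {
  double t0;                 // timestamp at which the first valid segment starts
  double dt;                 // knot spacing in seconds
  std::vector<double> ctrl;  // control points (needs at least 4)

  // Valid for t in [t0, t0 + (ctrl.size() - 3) * dt); no bounds checking here.
  double evaluate(double t) const {
    double s = (t - t0) / dt;
    std::size_t i = static_cast<std::size_t>(std::floor(s)) + 1;  // segment index
    double u = s - std::floor(s);                                 // fraction in [0,1)
    // Standard uniform cubic B-spline basis; weights sum to 6, hence the /6.
    const std::array<double, 4> basis = {
        (1.0 - u) * (1.0 - u) * (1.0 - u),
        3.0 * u * u * u - 6.0 * u * u + 4.0,
        -3.0 * u * u * u + 3.0 * u * u + 3.0 * u + 1.0,
        u * u * u};
    double value = 0.0;
    for (std::size_t k = 0; k < 4; ++k)
      value += basis[k] * ctrl[i - 1 + k];  // segment uses ctrl[i-1 .. i+2]
    return value / 6.0;
  }
};
```

Because cubic B-splines reproduce linear functions, evenly spaced control points `{0, 1, 2, ...}` interpolate back to a straight line, which gives a quick sanity check for the basis weights.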
Description
Real robots don’t live in perfect lab conditions. So how do we estimate motion when sensors are noisy, unsynchronized, and running at different speeds? This project takes a modern approach by modelling motion in continuous time, allowing seamless fusion of camera, LiDAR, and IMU data. You’ll explore trajectory optimization, time alignment, and multi-sensor calibration, building systems that perform reliably “in the wild.” This is a great fit if you enjoy combining math, coding, and real-world experimentation to push the limits of state estimation.
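As a minimal illustration of the time-alignment subproblem, the sketch below estimates a constant offset between two signals (e.g. IMU angular-rate magnitude vs. rotation rate derived from vision, resampled to a common rate beforehand) by brute-force cross-correlation. The function name and interface are hypothetical; in practice the offset is usually refined continuously inside the estimator rather than fixed once.

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch: find the integer-lag shift that maximizes the
// cross-correlation of two equally-sampled signals a and b.
// A positive return value means b's events occur `lag` samples later than a's.
int best_lag(const std::vector<double>& a,
             const std::vector<double>& b,
             int max_lag) {
  int best = 0;
  double best_score = -1e300;
  for (int lag = -max_lag; lag <= max_lag; ++lag) {
    double score = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
      int j = static_cast<int>(i) + lag;
      if (j >= 0 && j < static_cast<int>(b.size()))
        score += a[i] * b[static_cast<std::size_t>(j)];
    }
    if (score > best_score) {
      best_score = score;
      best = lag;
    }
  }
  return best;  // lag in samples; multiply by the sample period for seconds
}
```

Multiplying the returned lag by the common sample period gives the time offset in seconds, which can then seed a finer continuous-time refinement.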
Work Packages
- Literature review of work on multi-sensor odometry
- Literature review of work on continuous-time trajectory representations
- Design a multi-sensor fusion strategy
- Evaluate the performance of the approach against existing methods
Requirements
- Experience with C++ and ROS
References
- [1] G. Cioffi, T. Cieslewski, and D. Scaramuzza, “Continuous-time vs. discrete-time vision-based SLAM: A comparative study”, IEEE Robotics and Automation Letters, 2022.
- [2] D. Hug, I. Alzugaray, and M. Chli, “Hyperion: A Fast, Versatile Symbolic Gaussian Belief Propagation Framework for Continuous-Time SLAM”, European Conference on Computer Vision (ECCV), 2024.
- [3] C. Zheng et al., “FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry”, IEEE Transactions on Robotics, 2024.
- [4] S. Zhao et al., “Resilient odometry via hierarchical adaptation”, Science Robotics 10.109, 2025.