
Context-aware perception in adverse conditions

EVENT DATE
20 Aug 2025
TIME
2:00 pm – 4:00 pm
LOCATION
SUTD Think Tank 22 (Building 2, Level 3, Room 2.311)

Adverse conditions, such as rain or murky water, significantly impact various perception tasks. In rain, captured images are easily corrupted by raindrops on the lens and by lens flare. In turbid underwater conditions, the murkiness reduces the contrast and saturation of the image. To tackle these problems, we utilise contextual information to improve the robustness of perception algorithms. We highlight the following contributions: (a) robust localisation and tracking for autonomous urban driving in rain, (b) time-efficient mapping and scene understanding for coral reef monitoring in murky underwater conditions, and (c) validation on hardware-constrained mobile robots in adverse conditions for use in real-world applications.
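
As a toy illustration of the degradations described above (an assumption for illustration, not a method from the talk), the sketch below simulates murky-water imaging by desaturating an image, reducing its contrast, and blending in a greenish veiling light; all parameter values are illustrative.

```python
# Crude murky-water degradation model: desaturation, contrast loss,
# and veiling light. Parameter values are illustrative assumptions.
import numpy as np
import cv2

def simulate_murk(bgr_img, sat_scale=0.5, contrast_scale=0.6,
                  veil_bgr=(60, 90, 40), veil_strength=0.35):
    """Apply a toy murky-water degradation to an 8-bit BGR image."""
    # Reduce saturation in HSV space.
    hsv = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] *= sat_scale
    img = cv2.cvtColor(hsv.clip(0, 255).astype(np.uint8),
                       cv2.COLOR_HSV2BGR).astype(np.float32)
    # Compress contrast around the image mean.
    mean = img.mean(axis=(0, 1), keepdims=True)
    img = mean + contrast_scale * (img - mean)
    # Blend in a greenish veiling light, as turbid water scatters.
    veil = np.array(veil_bgr, dtype=np.float32)
    img = (1 - veil_strength) * img + veil_strength * veil
    return img.clip(0, 255).astype(np.uint8)

# Usage: degraded = simulate_murk(cv2.imread("reef.png"))
```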

For autonomous urban driving in rain, we propose a novel visual odometry (VO) algorithm, GRP-VO, that utilises a global reference path as contextual information for better estimation of the vehicle's position. Improving upon GRP-VO, we propose a novel sensor fusion approach, Map-Fusion, which utilises more generic contextual information, an open-source street network, to perform localisation and tracking. We conduct experiments on four datasets spanning clear weather to heavy rain and different geographical locations, and show that Map-Fusion reduces the Absolute Trajectory Error (ATE) by 1.66 m on average for VO and by 1.99 m for Visual-Inertial Odometry (VIO). We also implemented Map-Fusion on an unmanned ground vehicle and validated it in rain on urban roads, where it achieved 2.46 m ATE in clear weather and 6.05 m ATE in rain over a 150 m route.
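
As a rough illustration of how a street network can serve as contextual information (a minimal sketch under assumed data structures, not the authors' Map-Fusion implementation), the snippet below softly corrects each noisy odometry estimate toward the nearest point on a road polyline; the road segments, the `alpha` weight, and the helper names are hypothetical.

```python
# Hypothetical sketch: blend odometry positions with a road-network prior.
import numpy as np

def nearest_point_on_segment(p, a, b):
    """Project point p onto segment ab, clamped to the endpoints."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

def snap_to_road(p, road_segments):
    """Return the closest point to p over all road segments."""
    candidates = [nearest_point_on_segment(p, a, b) for a, b in road_segments]
    return min(candidates, key=lambda q: np.linalg.norm(q - p))

def fuse_with_map(odom_xy, road_segments, alpha=0.3):
    """Blend each odometry position with its road-snapped counterpart.

    alpha is the (assumed) confidence placed on the map prior.
    """
    fused = []
    for p in np.asarray(odom_xy, dtype=float):
        q = snap_to_road(p, road_segments)
        fused.append((1.0 - alpha) * p + alpha * q)
    return np.array(fused)

# Toy example: a straight east-west road and odometry that drifts off it.
road = [(np.array([0.0, 0.0]), np.array([100.0, 0.0]))]
odometry = [(x, 0.1 * x) for x in range(0, 101, 10)]
print(fuse_with_map(odometry, road))
```

In practice such a prior would enter a probabilistic filter rather than a fixed blend, but the sketch shows why a street network constrains drift: the map pulls estimates back toward physically plausible positions.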

For murky underwater conditions, we propose Murky-CHARM, a coral health assessment framework for coral reef monitoring that tackles both the mapping and the scene understanding perception tasks. For mapping, we propose Maplion, which improves the time efficiency of structure-from-motion reconstruction for mapping underwater structures (e.g., corals) in murky environments. Maplion leverages semantic information in the scene, together with the underwater vehicle's pose, to sample the most informative frames for mapping. It reduces the mapping time from 16 hours to 4 minutes on average while maintaining map quality in terms of viewpoint diversity. Lastly, for scene understanding, we address segmentation in murky underwater environments: we curated a dataset of segmented corals in murky waters and used it to fine-tune a SegFormer model for coral segmentation. The fine-tuned model outperforms the pre-trained model by 0.56 mIoU. Both the proposed mapping and scene understanding algorithms were validated using an underwater vehicle for coral reef monitoring at the coral nursery in Sentosa, Singapore.
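
To illustrate the flavour of informative-frame sampling (a hypothetical sketch, not the actual Maplion algorithm), the snippet below greedily keeps frames that contain enough of the target semantic class and whose camera pose differs sufficiently from frames already kept; all thresholds, names, and the scoring scheme are assumptions.

```python
# Hypothetical sketch: semantic- and pose-aware frame sampling for SfM.
import numpy as np

def select_frames(poses, coral_fractions, min_coral=0.05,
                  min_translation=0.5, min_yaw_deg=10.0):
    """Pick a sparse, viewpoint-diverse subset of frames for reconstruction.

    poses:           list of (x, y, z, yaw_deg) camera poses
    coral_fractions: per-frame fraction of pixels segmented as coral
    """
    selected = []
    for i, (pose, coral) in enumerate(zip(poses, coral_fractions)):
        if coral < min_coral:        # skip frames with little coral content
            continue
        pose = np.asarray(pose, dtype=float)
        novel = True
        for j in selected:
            prev = np.asarray(poses[j], dtype=float)
            # Too close in both position and heading (angle wrap-around
            # is ignored for brevity) -> redundant viewpoint.
            if (np.linalg.norm(pose[:3] - prev[:3]) < min_translation and
                    abs(pose[3] - prev[3]) < min_yaw_deg):
                novel = False
                break
        if novel:
            selected.append(i)
    return selected

# Toy example: a slow survey where consecutive frames barely move.
poses = [(0.1 * k, 0.0, -2.0, 2.0 * k) for k in range(100)]
coral = [0.2 if k % 3 else 0.01 for k in range(100)]
print(select_frames(poses, coral))
```

Discarding redundant viewpoints before reconstruction shrinks the feature-matching workload, which grows rapidly with frame count, and is one plausible route to the kind of speed-up reported above.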

Speaker’s profile

Tan Yu Xiang is a final-year PhD candidate in Information Systems Technology and Design at the Singapore University of Technology and Design (SUTD), supervised by Dr Malika Meghjani. His research interests include robust perception, computer vision, and marine robotics. His thesis, titled Context-Aware Perception in Adverse Conditions, explores the effects of adverse conditions, such as rain and murky water, on various perception tasks. Specifically, it investigates visual odometry algorithms for autonomous urban driving in rain, as well as 3D mapping and scene understanding for coral reef monitoring. Beyond his academic work, Yu Xiang has contributed to the development and teaching of the Mobile Robotics course at SUTD.
