
Sensor Fusion in Adverse Weather

Advisor: Dr. Sivakumar Rathinam
2019 - 2021
[Fig 1 images: ZED (visible-light) camera on the left, FLIR thermal camera on the right, both facing the sun]

Fig 1: Motivation: Thermal cameras are impervious to glare from the sun. In the image on the right, the sun simply appears as a bright spot, without the halo.

One of the most important challenges for the autonomous driving industry is enabling vehicles to perceive their environment. This perception task needs to be performed consistently across a wide range of weather conditions, which may affect the visible range of different sensor modalities in different ways.

In an attempt to address this, I recently started working with my committee co-chair on a project funded by Safe-D and the Texas A&M Transportation Institute (Project webpage).

As part of this project, we are combining information from multiple thermal cameras (which have proven resilient in fog, low-visibility, and high-glare/direct-sunlight scenarios) and associating the objects detected by the cameras with those picked up by on-board radars. This helps us identify the type, relative motion, and estimated trajectories of objects in the vehicle's environment.
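For illustration, here is a minimal sketch of one way such an association could work, assuming the radar returns have already been projected into the camera's image plane through an extrinsic calibration. The function, threshold, and data layout below are hypothetical, not the project's actual pipeline.

    # Hypothetical sketch: match camera bounding boxes to radar returns
    # that have already been projected into pixel coordinates.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(boxes, radar_px, max_dist=75.0):
        """boxes: (N, 4) array of [x1, y1, x2, y2] camera detections.
        radar_px: (M, 2) array of radar returns in pixel coordinates.
        Returns (box_idx, radar_idx) pairs within max_dist pixels."""
        centers = np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                            (boxes[:, 1] + boxes[:, 3]) / 2], axis=1)
        # Cost = Euclidean distance from each box center to each radar point.
        cost = np.linalg.norm(centers[:, None, :] - radar_px[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(cost)  # Hungarian matching
        return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < max_dist]

Pairing each bounding box with at most one radar return this way (Hungarian matching on pixel distance) avoids the greedy mismatches a simple nearest-neighbor pass can produce when detections sit close together.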

My role in the project so far has been to first cascade the cameras into a single wide-angle panorama and then perform object detection on it. The first part was completed by modifying the FLIR ADK ROS package from AutonomouStuff and using a Python script to transform and stitch the images together with OpenCV.
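As a rough sketch of what such a transform/stitch step can look like in OpenCV: here the pairwise homography is estimated from ORB feature matches, which is a stand-in (a fixed-mount rig can instead use a pre-calibrated transform), and all names are illustrative.

    # Sketch: warp the right image into the left image's frame and paste.
    import cv2
    import numpy as np

    def stitch_pair(left, right):
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(left, None)
        k2, d2 = orb.detectAndCompute(right, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:100]
        src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # right -> left
        h, w = left.shape[:2]
        pano = cv2.warpPerspective(right, H, (w * 2, h))  # warp onto wide canvas
        pano[:h, :w] = left                               # overlay left image
        return pano

With rigidly mounted cameras, the homography only needs to be estimated once from a calibration scene and can then be reused for every frame, which is far cheaper than matching features frame by frame.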


Fig 2: Multi-camera mount designed and fabricated using a 3D printer

For reference, the data below was collected at twilight, shortly after sunset:


Fig 3: Transformed and stitched thermal panorama (190° total FOV)

For object detection, as seen above, I retrained YOLOv3 on the FLIR ADAS dataset after converting all of its images and annotations to a 5:1 aspect ratio to match the panorama.
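A hypothetical sketch of that aspect-ratio conversion, assuming YOLO-format label files and symmetric horizontal padding; how the originals were actually edited isn't detailed above, so the padding strategy, file layout, and names here are assumptions.

    # Sketch: pad each frame to a 5:1 aspect ratio and rescale its
    # YOLO labels (class, cx, cy, w, h, all normalized to [0, 1]).
    import cv2

    def pad_to_5to1(img_path, label_path, out_img, out_label):
        img = cv2.imread(img_path)
        h, w = img.shape[:2]
        new_w = 5 * h                  # target width for a 5:1 frame
        pad = (new_w - w) // 2         # padding added on the left side
        padded = cv2.copyMakeBorder(img, 0, 0, pad, new_w - w - pad,
                                    cv2.BORDER_CONSTANT, value=0)
        cv2.imwrite(out_img, padded)
        with open(label_path) as f, open(out_label, "w") as out:
            for line in f:
                cls, cx, cy, bw, bh = line.split()
                # Re-normalize x-center and width against the padded width;
                # y-values are untouched since the height is unchanged.
                cx = (float(cx) * w + pad) / new_w
                bw = float(bw) * w / new_w
                out.write(f"{cls} {cx:.6f} {cy} {bw:.6f} {bh}\n")

Padding rather than cropping keeps every original annotation valid; only the x-coordinates need re-normalizing.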
