
Multi-sensor Fusion for Navigation and Mapping (2009-2014)

Current robotic maps are limited by decades-old range sensing technology. Only multi-sensor approaches (LIDAR, camera, RADAR, and multispectral) can provide the density and quality of data required for automated inspection, operation, and science. My PhD research explored the synergistic fusion of multi-modal optical sensing to enhance geometric understanding (super-resolution), the sampling of locations of interest (image-directed scanning), and material understanding from source motion.