Extreme Perception

    My research seeks to improve the performance of computer vision in extreme environments by understanding and exploiting the physics of light transport. It is well known that the use of active illumination in the form of “structured light” can greatly enhance image understanding; however, accurately modeled natural illumination can be leveraged in the same way. My dissertation explored new ways to integrate computer vision, lighting, and photometry to tackle the most challenging perception environments. I have continued this approach in my professional career at NASA by leading navigation efforts on the VIPER and Resource Prospector missions, both targeted to the polar regions of the Moon. I have also led appearance modeling efforts on several of NASA’s photorealistic simulators for the Moon and Europa.

    Image gallery:

    - Shape from Shading with active illumination from a mobile robot (2009)
    - Streamflow RiOS payload measures the surface flow of the Sacramento River from a UAV with thermal-visual sensing (2023)
    - VIPER rover lights up calibration targets with blue navigation lights during clean room testing (2024)
    - Lighting conditions at the lunar poles, with terrain constructed in the lab using LHS-1 regolith simulant (2022)
    - Testing the ICICLES thermal-visual mapping payload for automated landing site selection at Lake Tahoe (2019)
    - Panorama of Death Valley icy moon analog terrain (2019)