Computational Imaging Through Atmospheric Turbulence

Nicholas Chimitt, Purdue University

Abstract

Imaging at range for biometric, scientific, or military applications often suffers from degradations caused by the atmosphere. These degradations, which arise from the non-uniformity of the atmospheric medium, can be modeled as the result of turbulence. Dating back to Kolmogorov's work in the 1940s, the field has had many successes in modeling, and some in mitigating, the effects of turbulence in images. Today, modern restoration methods often take the form of learning-based solutions, which require large amounts of training data. This places atmospheric turbulence mitigation at an interesting point in its history: simulators that accurately capture the effects of the atmosphere were developed without any consideration of deep learning methods and often lack critical requirements of today's solutions. In this work, we describe a simulator that is not only fast and accurate but also end-to-end differentiable, allowing for end-to-end training with a reconstruction network. This simulation, which we refer to as Zernike-based simulation, achieves accuracy comparable to its purely optics-based counterparts while being up to 1000x faster. To achieve this, we combine theoretical developments, engineering efforts, and learning-based solutions. Our Zernike-based simulation not only aids the application of modern solutions to this classical problem but also opens the field to new possibilities with what we refer to as computational image formation.

Degree

Ph.D.

Advisors

Chan, Purdue University.

Subject Area

Electrical engineering|Remote sensing
