The radiance measured by the Land Satellite (Landsat) multispectral scanner (MSS) in a given channel I, where I = 1,2,3,4, is determined primarily by four quantities:
1. The reflectance ρI of the target (i.e., the element of the Earth's surface in the field of view),
2. The solar zenith angle θ,
3. The haze level τH in the atmosphere, and
4. The average reflectance ρ̄I of the adjacent areas of the Earth's surface outside the field of view (the overbar distinguishes this background reflectance from the target reflectance ρI).
In this paper the haze level τH is defined as the haze optical depth* at wavelength 0.5 µm. The haze optical depth at other wavelengths λ is denoted τH(λ). Normally, in the analysis of Landsat data one wishes to classify certain objects on the Earth's surface on the basis of their reflectance ρI. These objects may be in the same Landsat image or may be in several different images separated in space and time. Variations in θ, τH, and ρ̄I within a scene or from one scene to another alter the measured radiances and therefore reduce classification accuracy.
This paper describes a method for simulating the effects of such variations and correcting for them. Simulation and correction are really the same process, since correction consists of simulating the MSS response as if the Sun angle, haze level, or background reflectance had values different from the actual ones.
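The correction-as-simulation idea can be sketched with a deliberately simplified model. The sketch below is not the paper's actual radiative-transfer model: it assumes a linear relation L = A·ρ + B for one channel, with invented coefficient formulas in which the gain A depends on Sun zenith angle θ and haze optical depth τH, and the offset B (path radiance) depends on τH and the background reflectance ρ̄. The function names and coefficients are hypothetical; only the workflow (invert the model under actual conditions, then re-simulate under reference conditions) reflects the text.

```python
import math

def coefficients(theta_deg, tau_h, rho_bg):
    """Toy gain/offset for one MSS channel (hypothetical formulas)."""
    mu = math.cos(math.radians(theta_deg))          # cosine of Sun zenith angle
    transmittance = math.exp(-tau_h / mu)           # Beer-Lambert direct beam
    gain = mu * transmittance                       # target term A
    offset = 0.05 * tau_h + 0.02 * tau_h * rho_bg   # haze + adjacency term B
    return gain, offset

def radiance(rho, theta_deg, tau_h, rho_bg):
    """Simulate channel radiance for target reflectance rho."""
    a, b = coefficients(theta_deg, tau_h, rho_bg)
    return a * rho + b

def correct(measured, actual, reference):
    """Correction as simulation: invert the model under the actual
    (theta, tau_h, rho_bg) conditions to recover rho, then re-simulate
    the radiance under the reference conditions."""
    a, b = coefficients(*actual)
    rho = (measured - b) / a
    return radiance(rho, *reference)

# Example: a measurement taken at theta = 45 deg, tau_h = 0.4, background
# reflectance 0.2, standardized to theta = 30 deg, tau_h = 0.1, background 0.1.
actual = (45.0, 0.4, 0.2)
reference = (30.0, 0.1, 0.1)
L_measured = radiance(0.25, *actual)
L_standard = correct(L_measured, actual, reference)
```

Note that correcting to the actual conditions is the identity, which is one quick sanity check on such a model.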