Machine Learning-based Variance Analysis of Brightness Temperature in Simulated Satellite Footprints
Abstract
This study investigates the variance in brightness temperature (BT) within simulated satellite footprints for Observing System Simulation Experiments (OSSE), focusing specifically on Channels 5 and 11 of the Advanced Microwave Sounding Unit (AMSU-A). High-resolution atmospheric simulations from the DYnamics of the Atmospheric general circulation Modeled On Non-hydrostatic Domains (DYAMOND) dataset were used to generate brightness temperature data with the Python interface for the Community Radiative Transfer Model (PyCRTM). A computational design map incorporating Random Forest and Association Rule Mining was employed to identify and validate key atmospheric variables influencing BT variance. This ensemble approach facilitated a deeper understanding of atmospheric variability across the coast of Greenland, the Arctic face, and the East Pacific regions. Results showed that surface skin temperature and wind velocities significantly influence BT variance, particularly in lower atmospheric layers (Channel 5), while upper atmospheric temperature variance was prominent in higher layers (Channel 11). The findings highlight the utility of machine learning methodologies in improving the accuracy of radiance simulations. This methodology provides a transferable framework for geospatial variance analysis applicable to diverse environmental monitoring and sustainability applications.
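The Random Forest step described in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the predictor names, the synthetic footprint data, and the toy relationship between predictors and BT variance are all assumptions chosen to mimic the Channel 5 finding that surface skin temperature and winds dominate.

```python
# Hypothetical sketch: ranking atmospheric predictors of BT variance
# with a Random Forest, analogous in spirit to the paper's approach.
# Variable names and data below are illustrative assumptions, not the
# DYAMOND/PyCRTM outputs used in the study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_footprints = 500
features = ["skin_temp", "u_wind", "v_wind", "upper_temp", "humidity"]

# Synthetic footprint-level predictors (stand-ins for real fields)
X = rng.normal(size=(n_footprints, len(features)))

# Toy target: BT variance driven mostly by skin temperature and winds,
# loosely mimicking the reported Channel 5 (lower-atmosphere) result
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2] \
    + 0.1 * rng.normal(size=n_footprints)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X, y)

# Rank predictors by impurity-based feature importance
ranking = sorted(zip(features, rf.feature_importances_),
                 key=lambda pair: -pair[1])
for name, importance in ranking:
    print(f"{name:12s} {importance:.3f}")
```

In such a workflow, the importance ranking would then be cross-checked against Association Rule Mining results, the validation role the abstract assigns to the ensemble design.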
Keywords
brightness temperature, radiance simulation, high-resolution atmospheric modeling, Observing System Simulation Experiments (OSSE)
Document Type
Paper
Start Date
19-6-2025 9:10 AM
End Date
19-6-2025 10:30 AM
DOI
10.5703/1288284317902
Recommended Citation
Kulkarni, Chhaya R.; Prive, Nikki; and Janeja, Vandana P., "Machine Learning-based Variance Analysis of Brightness Temperature in Simulated Satellite Footprints" (2025). I-GUIDE Forum. 2.
https://docs.lib.purdue.edu/iguide/2025/presentations/2