Abstract

We use the high-resolution North American Regional Reanalysis (NARR) dataset to build a Temperature Change Index (TCI) for the United States, based on four contributing variables derived from the layer-averaged temperature and lapse rate of the 1000 mb–700 mb layer (near-surface to 3000 meters) for the 1979-2008 period. The analysis uses Geographic Information Systems (GIS) methods to identify distinct regional patterns from aggregate temperature trend and variability scores. The resulting index allows us to identify and compare regions that experience high (low) temperature trends and variability, referred to as hot spots (cold spots). The upper Midwest emerges as the region with the largest increases and variability, owing to the large magnitudes of the trends and variability of all contributing variables. In contrast, the lowest TCI scores are observed over the southeastern United States and the Rocky Mountains.

Regarding landscape characteristics, high TCI scores occur mostly over agricultural lands (highlighting the sensitivity of crop yields to temperature variability), while low scores generally prevail over forests.

At the seasonal time scale, the largest and most contrasting TCI scores occur during winter and, to a lesser extent, fall. All variables used to build the TCI show well-defined seasonal patterns and differences, especially between winter and summer.

Our method, based on thickness layers, provides a more complete analysis than methods based on single-level data and confirms that temperature is a robust component of climate change that must be included in any vulnerability assessment of climate change risks.
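The GIS weighted-overlay step named in the keywords can be sketched in code: each contributing variable is reclassified onto a common score scale, and the scores are combined with weights to yield the index. The minimal sketch below uses hypothetical variable names, toy 2x2 grids, and an assumed equal weighting on a 1-5 score scale; none of these values come from the study.

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Combine equally shaped score grids into one index grid.

    layers  -- dict of name -> 2-D array of reclassified scores (e.g., 1-5)
    weights -- dict of name -> weight; weights must sum to 1
    """
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    names = list(layers)
    stack = np.stack([layers[n] for n in names])          # (n, rows, cols)
    w = np.array([weights[n] for n in names]).reshape(-1, 1, 1)
    return (stack * w).sum(axis=0)                        # weighted sum per cell

# Toy grids standing in for the four reclassified variables
# (trend and variability of layer temperature and lapse rate).
layers = {
    "temp_trend":        np.array([[5, 4], [2, 1]]),
    "temp_variability":  np.array([[4, 4], [2, 2]]),
    "lapse_trend":       np.array([[5, 3], [3, 1]]),
    "lapse_variability": np.array([[4, 5], [1, 2]]),
}
weights = {name: 0.25 for name in layers}  # equal weighting assumed

tci = weighted_overlay(layers, weights)
# High-scoring cells mark candidate "hot spots"; low-scoring cells, "cold spots".
```

In a GIS workflow the same operation is typically performed on co-registered raster layers; the NumPy version above only illustrates the arithmetic of the overlay.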

Keywords

Temperature Change Index, Lapse rate, GIS, weighted overlay, hot spot, cold spot

Session

Poster

Date of this Version

11-26-2009