Document Type

Article

Abstract

Regional-scale estimation of soil moisture using in situ field observations is not possible due to problems with the representativeness of the sampling and costs. Remotely sensed satellite data are helpful in this regard. Here, the simulations of 19- and 37-GHz vertical and horizontal polarization brightness temperatures and estimation of soil moistures using data from the Special Sensor Microwave/Imager (SSM/I) for 798 0.25° × 0.25° boxes in the southwestern plains region of the United States for the time period between 1 August 1987 and 31 July 1988 are presented. A coupled land-canopy–atmosphere model is used for simulating the brightness temperatures. The land-surface hydrology is modeled using a thin-layer hydrologic model. The canopy scattering is modeled using a radiative transfer model, and the atmospheric attenuation is characterized using an empirical model. The simulated brightness temperatures are compared with those observed by the SSM/I sensor aboard the Defense Meteorological Satellite Program satellite. The observed brightness temperatures are used to derive the soil moistures using the canopy radiative transfer and atmospheric attenuation model. The discrepancies between the SSM/I-based estimates and the simulated soil moisture are discussed. The mean monthly soil moistures estimated using the 19-GHz SSM/I brightness temperature data are interpreted along with the mean monthly leaf area index and accumulated rainfall. The soil moistures estimated using the 19-GHz SSM/I data are used in conjunction with the hydrologic model to estimate cumulative monthly evaporation. The results of the simulations hold promise for the utilization of microwave brightness temperatures in hydrologic modeling for soil moisture estimation.
