Forward Simulation of Multi-Frequency Microwave Brightness Temperature over Desert Soils in Kuwait and Comparison with Satellite Observations
In this study, we examine the variations of bare soil surface microwave brightness temperatures and evaluate the performance of a dielectric mixing model over the desert of Kuwait. We use data collected in a field survey together with data obtained from NASA Soil Moisture Active Passive (SMAP), European Space Agency Soil Moisture and Ocean Salinity (SMOS), Advanced Microwave Scanning Radiometer 2 (AMSR2), and Special Sensor Microwave/Imager (SSM/I). In situ measurements were collected during two intensive field campaigns over bare, flat, and homogeneous soil terrains in the desert of Kuwait. Despite the prevailing dry desert environment, a large range of soil moisture values was monitored, owing to preceding rain events and the subsequent dry-down. The mean relative difference (MRD) is within ±0.005 m3·m−3 during the two sampling days, reflecting the consistency of soil moisture in space and time. As predicted by the model, the higher-frequency channels (18 to 19 GHz) demonstrate reduced sensitivity to surface soil moisture even in the absence of vegetation, topography, and heterogeneity. In the 6.9 to 10.7 GHz range, only the horizontal polarization is sensitive to surface soil moisture. In contrast, at 1.4 GHz both polarizations are sensitive to soil moisture and span a large dynamic range, as predicted by the model. The error statistics of the difference between observed satellite brightness temperatures (Tb) (excluding SMOS data due to radio frequency interference, RFI) and simulated brightness temperatures (Tbs) show a Root Mean Square Deviation (RMSD) of 5.05 K at vertical polarization and 4.88 K at horizontal polarization. These errors could stem from the performance of the dielectric mixing model, the soil moisture sampling depth, and the parametrization of effective temperature and surface roughness.
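The MRD and RMSD statistics quoted above follow standard definitions. As a minimal sketch (not the study's code or data): MRD for a sampling point averages its deviation from the field mean over sampling days, and RMSD summarizes the observed-minus-simulated brightness temperature error. Since the abstract reports MRD in m3·m−3, the sketch below uses the absolute (non-normalized) difference; the commonly used normalized variant divides each deviation by the field mean. All numeric values shown are illustrative, not the study's measurements.

```python
import numpy as np

def mean_relative_difference(sm_point, sm_field_mean):
    """MRD of one sampling point: mean over sampling days of
    (point soil moisture - field-mean soil moisture), in m3/m3.
    A normalized variant would divide each term by sm_field_mean."""
    sm_point = np.asarray(sm_point, dtype=float)
    sm_field_mean = np.asarray(sm_field_mean, dtype=float)
    return float(np.mean(sm_point - sm_field_mean))

def rmsd(tb_observed, tb_simulated):
    """Root Mean Square Deviation between observed and simulated
    brightness temperatures, in kelvin."""
    diff = np.asarray(tb_observed, dtype=float) - np.asarray(tb_simulated, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative values only.
sm_point = [0.10, 0.12]        # point measurements on two sampling days
sm_mean = [0.11, 0.11]         # field means on the same days
tb_obs = [265.2, 270.1, 268.4, 262.9]
tb_sim = [260.8, 274.6, 266.1, 267.3]
print(mean_relative_difference(sm_point, sm_mean))
print(rmsd(tb_obs, tb_sim))
```

A point whose MRD stays within ±0.005 m3·m−3 across sampling days, as reported here, tracks the field mean closely and can be taken as spatially representative.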