First Analysis of SMOS Sea Surface Salinity
Abstract
SMOS (Soil Moisture and Ocean Salinity) is the first interferometric radiometer in orbit. In the first stage of ground processing (Level 1), an image reconstruction algorithm is applied to yield measured brightness temperatures (TB). Preliminary studies have shown that this processing is critical and likely to introduce biases that affect subsequent processing. A comparison of modelled to reconstructed TB is therefore essential. Homogeneous ocean surfaces far from land masses are ideal for this task, as the variation of TB with the satellite viewing geometry (incidence angle) is relatively well known. Extensive comparisons were conducted between SMOS Level 1c TB and TB simulated with the default forward model implemented in the ESA SMOS ocean salinity processing, using ECMWF (European Centre for Medium-Range Weather Forecasts) forcings. They demonstrate that the North-South behavior of SMOS measurements over the ocean is remarkably consistent with the simulated L-band signal, and that the noise of the measurements with respect to the model estimate depends on the measurement location in the field of view and is very close to the expected radiometric uncertainty. On the other hand, systematic biases of several kelvins are observed that depend on the location of the measurement in the field of view. After these systematic biases are removed, SMOS sea surface salinities (SSS) are retrieved at global scale over 5 days in March 2010. They match the SSS climatology quite well; SMOS anomalies with respect to the climatology are finally compared to those deduced from in situ measurements.
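To make the field-of-view-dependent bias correction concrete, the following is a minimal sketch of one plausible approach: bin the measured-minus-modelled TB differences by position in the antenna field of view, average each bin to estimate a systematic bias map, and subtract it from the measurements. All variable names (xi, eta, tb_meas, tb_model), the binning scheme, and the synthetic data are assumptions for illustration, not the actual ESA Level 1/Level 2 processing.

```python
import numpy as np

# Hypothetical illustration of FOV-dependent bias estimation and removal.
# xi, eta: director-cosine coordinates of each measurement in the antenna
# field of view; tb_meas, tb_model: brightness temperatures (K).
# All synthetic -- names and values are assumptions, not SMOS products.
rng = np.random.default_rng(0)
n = 100_000
xi = rng.uniform(-0.5, 0.5, n)                    # synthetic FOV coordinates
eta = rng.uniform(-0.5, 0.5, n)
tb_model = 100.0 + 5.0 * rng.standard_normal(n)   # synthetic modelled TB (K)
true_bias = 2.0 * xi                              # bias varying across the FOV
tb_meas = tb_model + true_bias + 2.5 * rng.standard_normal(n)  # add noise

# Bin the FOV into cells and average the measured-minus-modelled difference
# in each cell to estimate the systematic bias map.
nbins = 20
edges = np.linspace(-0.5, 0.5, nbins + 1)
ix = np.clip(np.digitize(xi, edges) - 1, 0, nbins - 1)
iy = np.clip(np.digitize(eta, edges) - 1, 0, nbins - 1)

diff = tb_meas - tb_model
sum_map = np.zeros((nbins, nbins))
cnt_map = np.zeros((nbins, nbins))
np.add.at(sum_map, (ix, iy), diff)
np.add.at(cnt_map, (ix, iy), 1)
bias_map = sum_map / np.maximum(cnt_map, 1)       # mean bias per FOV cell

# Remove the estimated bias from each measurement according to its FOV cell.
tb_corrected = tb_meas - bias_map[ix, iy]
print(f"rms before: {np.std(tb_meas - tb_model):.2f} K, "
      f"after: {np.std(tb_corrected - tb_model):.2f} K")
```

On synthetic data like this, the residual scatter after correction drops back toward the injected radiometric noise level, which mirrors the abstract's finding that, once the systematic FOV biases are removed, the measurement noise is close to the expected radiometric uncertainty.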
Domains
Planète et Univers [physics]
Origin: Publisher files authorized on an open archive