Detection of linear trends in multisensor time series in the presence of auto-correlated noise: application to the chlorophyll-a SeaWiFS and MERIS datasets and extrapolation to the incoming Sentinel 3-OLCI mission
Abstract
The detection of long-term trends in geophysical time series is a key issue in climate change studies. This detection is affected by several factors: the amplitude of the trend to be detected, the length of the available datasets, and the noise properties. Although the auto-correlation observed in geophysical time series does not bias the trend estimate, it affects the estimation of its uncertainty and, consequently, the ability to detect a significant trend. Ignoring the auto-correlation typically leads to an over-detection of significant trends. Satellite time series have provided remote observations of the sea surface for several decades. Because individual satellite lifetimes are usually between 5 and 10 years, these time series do not cover the same periods and are acquired by different sensors with different characteristics. These differences lead to unknown level shifts (biases) between the datasets, which affect trend detection. We propose here a generic framework to assess the detectability of a linear trend and its significance from multi-sensor datasets.
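To make the over-detection mechanism concrete, a standard large-sample result for a linear trend estimated by least squares in AR(1) noise can be sketched; the symbols $n$, $\sigma_N$, $\phi$, and $\hat{\omega}$ are introduced here for illustration and are not part of the abstract itself. The standard deviation of the estimated trend $\hat{\omega}$ scales, for large $n$, as

\[
\sigma_{\hat{\omega}} \;\propto\; \frac{\sigma_N}{n^{3/2}}\,\sqrt{\frac{1+\phi}{1-\phi}},
\]

where $n$ is the number of time steps, $\sigma_N$ the standard deviation of the noise, and $\phi$ its lag-one autocorrelation. Assuming white noise ($\phi = 0$) when in fact $\phi > 0$ therefore underestimates $\sigma_{\hat{\omega}}$ by a factor of roughly $\sqrt{(1+\phi)/(1-\phi)}$, so test statistics appear larger than they really are and trends are declared significant too often.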