Does a general structure exist for adaptation/learning algorithms?
Abstract
There are many parameter adaptation/learning algorithms (PALA) used in adaptive control, system identification and neural networks (Nesterov, conjugate gradients, momentum back-propagation, averaged gradient, integral+proportional+derivative, ...). Unfortunately, for most of these algorithms no results are available on how to choose the various coefficients (weights) so as to guarantee the stability of the parameter estimator for any value of the learning rate and any initial conditions. All these algorithms are in fact particular cases of a general PALA structure introduced in this paper. This structure is characterized by the presence of an embedded ARMA (Auto Regressive Moving Average) filter. Taking into account the inherent feedback structure of these adaptation/learning algorithms, the passivity approach is used to address the stability issue. Conditions that ensure the stability of this general structure are provided and then particularized to the specific algorithms described in the paper. The impact of the MA and AR terms of the embedded filter on the performance of the algorithms is illustrated through simulation.
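To make the abstract's central claim concrete, the sketch below shows one plausible reading of a "gradient step applied to an ARMA-filtered gradient," with momentum back-propagation and a two-tap averaged gradient recovered as particular coefficient choices. The class name, coefficient values, learning rate and toy quadratic loss are illustrative assumptions; they are not the paper's exact parameterization, which the abstract does not spell out.

```python
# Hedged sketch: a gradient update whose correction term f_t is the output of an
# embedded ARMA filter driven by the gradient g_t:
#   f_t = sum_i ar[i]*f_{t-1-i} + sum_j ma[j]*g_{t-j},   theta_t = theta_{t-1} - lr*f_t
# All coefficient choices below are illustrative assumptions.
from collections import deque


class ARMAFilteredGradient:
    def __init__(self, lr, ar, ma):
        self.lr, self.ar, self.ma = lr, list(ar), list(ma)
        self.f_hist = deque([0.0] * len(self.ar), maxlen=max(len(self.ar), 1))
        self.g_hist = deque([0.0] * len(self.ma), maxlen=len(self.ma))

    def step(self, theta, g):
        self.g_hist.appendleft(g)                     # g_t, g_{t-1}, ...
        f = sum(a * fp for a, fp in zip(self.ar, self.f_hist)) \
            + sum(b * gp for b, gp in zip(self.ma, self.g_hist))
        self.f_hist.appendleft(f)                     # store f_t for the AR part
        return theta - self.lr * f


# Toy quadratic problem: minimize 0.5 * h * theta**2
h = 4.0
grad = lambda th: h * th

# Momentum back-propagation as the special case ar=[0.9], ma=[1.0]
momentum = ARMAFilteredGradient(lr=0.05, ar=[0.9], ma=[1.0])
# A two-tap averaged-gradient variant as the special case ar=[], ma=[0.5, 0.5]
averaged = ARMAFilteredGradient(lr=0.05, ar=[], ma=[0.5, 0.5])

for name, opt in [("momentum", momentum), ("averaged gradient", averaged)]:
    theta = 1.0
    for _ in range(100):
        theta = opt.step(theta, grad(theta))
    print(name, theta)   # both converge toward 0 for these coefficient choices
```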
Domains
Automatic Control