Journal article in IEEE Transactions on Signal Processing, 2021

Majorization-Minimization on the Stiefel Manifold With Application to Robust Sparse PCA

Arnaud Breloy
Sandeep Kumar
Ying Sun
Daniel Palomar

Abstract

This paper proposes a framework for optimizing cost functions of orthonormal basis learning problems, such as principal component analysis (PCA), subspace recovery, and orthogonal dictionary learning. The optimization algorithm is derived using the majorization-minimization framework in conjunction with orthogonal projection reformulations, which handle the orthonormality constraint in a systematic manner. Within this framework, we derive surrogate functions for various standard objectives that can then be used as building blocks, with examples for robust learning costs and sparsity-enforcing penalties. To illustrate this point, we propose a new set of algorithms for sparse PCA driven by this methodology, whose objective function combines an M-estimation-type subspace fitting term with a sparsity-promoting regularizer. Simulations and experiments on real data demonstrate the benefits of the proposed approach in terms of both performance and computational complexity.
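As a rough illustration of the update structure such an MM scheme typically yields (a minimal sketch, not the authors' algorithm): when the surrogate built at the current iterate U_k is linear in U, i.e. of the form tr(G(U_k)^T U), its exact maximizer over the Stiefel manifold is given in closed form by the orthogonal Procrustes solution, computed from a thin SVD of G(U_k). The helper names below (procrustes, mm_stiefel, build_G) are hypothetical; the demo uses the plain quadratic fit tr(U^T S U), for which the iteration reduces to a subspace iteration that recovers the leading principal subspace. Robust or sparsity-penalized costs would change only how G(U_k) is constructed, not the update itself.

```python
import numpy as np

def procrustes(G):
    # Closed-form maximizer of tr(G^T U) over {U : U^T U = I}
    # (orthogonal Procrustes solution, i.e. the polar factor of G).
    A, _, Bt = np.linalg.svd(G, full_matrices=False)
    return A @ Bt

def mm_stiefel(build_G, U0, n_iter=500, tol=1e-9):
    # Generic MM loop: at each iterate, build the matrix G(U_k) defining the
    # linear surrogate tr(G(U_k)^T U), then maximize it exactly via Procrustes.
    U = U0
    for _ in range(n_iter):
        U_next = procrustes(build_G(U))
        if np.linalg.norm(U_next - U) < tol:
            return U_next
        U = U_next
    return U

# Demo: tr(U^T S U) is convex in U for PSD S, so it is minorized by its tangent
# plane at U_k; maximizing that minorizer over the Stiefel manifold amounts to
# procrustes(S @ U_k), i.e. a subspace iteration toward the top eigenvectors.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))
S = X.T @ X / X.shape[0]                        # sample covariance
U0, _ = np.linalg.qr(rng.standard_normal((20, 3)))
U_hat = mm_stiefel(lambda U: S @ U, U0)         # spans the top-3 principal subspace
```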
Main file: J15_TSP21c.pdf (640.83 KB)
Origin: files produced by the author(s)

Dates and versions

hal-04313479 , version 1 (29-11-2023)

Identifiers

HAL Id: hal-04313479
DOI: 10.1109/TSP.2021.3058442

Cite

Arnaud Breloy, Sandeep Kumar, Ying Sun, Daniel Palomar. Majorization-Minimization on the Stiefel Manifold With Application to Robust Sparse PCA. IEEE Transactions on Signal Processing, 2021, 69, pp.1507-1520. ⟨10.1109/TSP.2021.3058442⟩. ⟨hal-04313479⟩