Stable recovery of the factors from a deep matrix product
Abstract
We study a deep matrix factorization problem. It takes as input a matrix $X$ obtained by multiplying $K$ matrices (called factors) and aims at recovering the factors. When $K=1$, this is the usual compressed sensing framework; when $K=2$, examples of applications include dictionary learning, blind deconvolution, and self-calibration; when $K\geq 3$, the framework applies to many fast transforms (such as the FFT). In particular, we apply the theorems to deep convolutional networks.
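For concreteness, here is a minimal sketch of the model described above; the factor names $X_1,\dots,X_K$ and the dimensions $n_{k-1}\times n_k$ are illustrative assumptions, not necessarily the paper's notation.

```latex
% Deep matrix factorization model: the observation is the product of K factors.
% Names X_1, ..., X_K and the chain of dimensions are assumed for illustration.
\[
  X \;=\; X_1 X_2 \cdots X_K ,
  \qquad X_k \in \mathbb{R}^{\,n_{k-1} \times n_k}, \quad k = 1, \dots, K .
\]
% K = 1: compressed sensing;  K = 2: dictionary learning, blind deconvolution,
% self-calibration;  K >= 3: fast transforms such as the FFT.
```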
Using a lifting, we provide: a necessary and sufficient condition for the identifiability of the factors (up to a scale indeterminacy); and an analogue of the Null Space Property, called the Deep Null Space Property, which is necessary and sufficient to guarantee the stable recovery of the factors.
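The scale indeterminacy mentioned above can be made explicit; the following is a standard observation and a sketch only, as the exact normalization used in the paper is not stated here.

```latex
% Scale indeterminacy: rescaling the factors by scalars whose product is 1
% leaves the product X unchanged, so the factors can only be recovered up to
% such rescalings.
\[
  (\lambda_1 X_1)(\lambda_2 X_2)\cdots(\lambda_K X_K) \;=\; X
  \quad \text{whenever} \quad \prod_{k=1}^{K} \lambda_k = 1 .
\]
```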