Implicit Regularization in Deep Tensor Factorization - Archive ouverte HAL
Conference Papers, Year: 2021

Implicit Regularization in Deep Tensor Factorization

Abstract

Attempts to study the implicit regularization associated with gradient descent (GD) have identified matrix completion as a suitable test-bed. Recent findings suggest that this phenomenon cannot be phrased as a norm-minimization problem, implying that a paradigm shift is required and that the dynamics of GD have to be taken into account. In the present work we address the more general setup of tensor completion by leveraging two popular tensor factorizations, namely Tucker and Tensor Train (TT). We track relevant quantities such as the tensor nuclear norm, effective rank, and generalized singular values, and we introduce deep Tucker and TT unconstrained factorizations to deal with the completion task. Experiments on both synthetic and real data show that gradient descent promotes low-rank solutions, and validate the conjecture that the phenomenon has to be addressed from a dynamical perspective.
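To make the setup concrete, below is a minimal sketch, not the authors' code, of tensor completion with a deep, unconstrained Tucker-style factorization trained by plain gradient descent, while monitoring the effective rank of an unfolding of the reconstruction. The tensor size, ground-truth rank, factor depth, initialization scale, learning rate, and the entropy-based effective-rank definition are all illustrative assumptions, and the implementation uses PyTorch for convenience.

```python
# Hypothetical sketch of tensor completion with a deep unconstrained
# Tucker-style factorization trained by gradient descent (GD).
# Sizes, ranks, depth, init scale, and learning rate are illustrative
# assumptions, not the authors' exact setup.
import functools
import torch

torch.manual_seed(0)
d, r, depth = 20, 3, 2            # tensor side, ground-truth rank, factor depth

# Ground-truth low-rank tensor: a random core contracted with random factors.
G = torch.randn(r, r, r)
U_true = [torch.randn(d, r) for _ in range(3)]
target = torch.einsum('abc,ia,jb,kc->ijk', G, *U_true)

mask = torch.rand(d, d, d) < 0.3  # observe roughly 30% of the entries

# Unconstrained parametrization: a full-size core, and each mode factor is a
# product of `depth` square matrices, so no rank constraint is imposed
# explicitly; any low-rank bias must come from the GD dynamics.
core = (0.1 * torch.randn(d, d, d)).requires_grad_()
layers = [[(0.1 * torch.randn(d, d)).requires_grad_() for _ in range(depth)]
          for _ in range(3)]
opt = torch.optim.SGD([core] + [w for mode in layers for w in mode], lr=0.2)

def effective_rank(mat):
    # One common entropy-based effective rank of the singular-value
    # distribution; assumed here as the tracked quantity.
    s = torch.linalg.svdvals(mat)
    p = s / s.sum()
    return torch.exp(-(p * torch.log(p + 1e-12)).sum()).item()

for step in range(3001):
    U = [functools.reduce(torch.matmul, mode) for mode in layers]
    pred = torch.einsum('abc,ia,jb,kc->ijk', core, *U)
    loss = ((pred - target)[mask] ** 2).mean()  # loss on observed entries only
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        # Monitor the effective rank of the mode-1 unfolding of the estimate.
        erank = effective_rank(pred.detach().reshape(d, -1))
        print(f'step {step:4d}  train loss {loss.item():.3e}  eff. rank {erank:.2f}')
```

With a small initialization, the qualitative behavior the abstract reports would show up here as the effective rank staying low along the GD trajectory even though the parametrization itself imposes no rank constraint; a TT variant would replace the Tucker core and factors with a chain of third-order cores.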
Main file: main.pdf (558.1 KB). Origin: files produced by the author(s).

Dates and versions

hal-03211964, version 1 (01-05-2021)

Cite

Paolo Milanesi, Hachem Kadri, Stéphane Ayache, Thierry Artières. Implicit Regularization in Deep Tensor Factorization. International Joint Conference on Neural Networks (IJCNN), Jul 2021, Online, China. ⟨hal-03211964⟩