Conference Paper, Year: 2022

Random Tensor Theory for Tensor Decomposition

Abstract

We propose a new framework for tensor decomposition based on trace invariants, which are particular cases of tensor networks. In general, tensor networks are diagrams/graphs that specify a way to "multiply" a collection of tensors together to produce another tensor, matrix, or scalar. The particularity of trace invariants is that the operation multiplying copies of a given input tensor into a scalar obeys specific symmetry constraints: the scalar resulting from this multiplication is invariant under certain transformations of the tensor involved. We focus our study on O(N)-invariant graphs, i.e., graphs whose associated scalar is invariant under orthogonal transformations of the input tensor. The proposed approach is novel and versatile, since it makes it possible to address different theoretical and practical aspects of both the CANDECOMP/PARAFAC (CP) and Tucker decomposition models. In particular, we obtain several results: (i) we generalize the computational limit of Tensor PCA (a rank-one tensor decomposition) to tensors whose axes have different dimensions; (ii) we introduce new algorithms for both decomposition models; (iii) we obtain theoretical guarantees for these algorithms; and (iv) we show improvements over the state of the art on synthetic and real data, which also highlights a promising potential for practical applications.
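
For concreteness, here is a minimal NumPy sketch (our own illustration, not the algorithm from the paper) of what such a trace invariant looks like: a "melonic" contraction of four copies of an order-3 tensor, a standard example of an O(N)-invariant graph, together with a numerical check that its value is unchanged when each axis of the tensor is transformed by an independent orthogonal matrix. The tensor, the index labels, and the QR-based test setup are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 8
    T = rng.standard_normal((N, N, N))  # illustrative random order-3 tensor

    def melonic_invariant(T):
        # Contract four copies of T so that every index appears exactly twice,
        # pairing indices axis by axis:
        # sum_{a,b,c,d,e,f} T[a,b,c] T[a,e,f] T[d,e,f] T[d,b,c]
        return np.einsum('abc,aef,def,dbc->', T, T, T, T)

    # Random orthogonal matrices via QR decomposition (test setup, not from the paper).
    Qs = [np.linalg.qr(rng.standard_normal((N, N)))[0] for _ in range(3)]
    # Transform each axis of T by its own orthogonal matrix.
    T_rot = np.einsum('ia,jb,kc,abc->ijk', Qs[0], Qs[1], Qs[2], T)

    print(melonic_invariant(T))      # some scalar value
    print(melonic_invariant(T_rot))  # equal up to floating-point error

The invariance follows because each index of each axis is contracted against another index of the same axis, so every pair of orthogonal matrices collapses via Q^T Q = I.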

Keywords

Dates and versions

hal-03996669, version 1 (20-02-2023)

Identifiers

Cite

Mohamed Ouerfelli, Mohamed Tamaazousti, Vincent Rivasseau. Random Tensor Theory for Tensor Decomposition. 36th AAAI Conference on Artificial Intelligence, Feb 2022, Online, United States. pp.7913-7921, ⟨10.1609/aaai.v36i7.20761⟩. ⟨hal-03996669⟩