Preprint, Working Paper. Year: 2021

Tensor-based framework for training flexible neural networks

Abstract

Activation functions (AFs) are an important part of the design of neural networks (NNs), and their choice plays a predominant role in the performance of an NN. In this work, we are particularly interested in the estimation of flexible activation functions using tensor-based solutions, where the AFs are expressed as a weighted sum of predefined basis functions. To do so, we propose a new learning algorithm that solves a constrained coupled matrix-tensor factorization (CMTF) problem. This technique fuses the first- and zeroth-order information of the NN, where the first-order information is contained in a Jacobian tensor that follows a constrained canonical polyadic decomposition (CPD). The proposed algorithm can handle different decomposition bases. The goal of this method is to compress large pretrained NN models by replacing subnetworks, i.e., one or multiple layers of the original network, with a new flexible layer. The approach is applied to a pretrained convolutional neural network (CNN) used for character classification.
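To make the parameterization concrete, the following is a minimal sketch of a flexible AF expressed as a weighted sum of predefined basis functions. The Gaussian basis, the function names, and the parameter values are illustrative assumptions; the paper's actual basis choice and its CMTF-based estimation algorithm are not reproduced here.

```python
import numpy as np

def gaussian_basis(t, centers, width=1.0):
    """Evaluate K Gaussian basis functions at the points in t.

    The Gaussian shape is an illustrative assumption; the framework
    only requires some set of predefined basis functions.
    t: (n,) array, centers: (K,) array -> (n, K) matrix.
    """
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width**2))

def flexible_activation(t, weights, centers, width=1.0):
    """Flexible AF: a weighted sum of the predefined basis functions."""
    return gaussian_basis(t, centers, width) @ weights

# Usage sketch: the weight vector is what a learning algorithm such as
# the constrained CMTF described in the abstract would estimate.
centers = np.linspace(-3.0, 3.0, 7)                 # K = 7 basis functions
weights = np.random.default_rng(0).normal(size=7)   # learnable coefficients
x = np.linspace(-3.0, 3.0, 5)
print(flexible_activation(x, weights, centers))
```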

Dates and versions

hal-03273321, version 1 (29-06-2021)

Identifiers

Cite

Yassine Zniyed, Konstantin Usevich, Sebastian Miron, David Brie. Tensor-based framework for training flexible neural networks. 2021. ⟨hal-03273321⟩