Multilinear discriminant analysis using tensor-tensor products
Abstract
Multilinear Discriminant Analysis (MDA) is a powerful dimension reduction method specifically designed for tensor data. Its goal is to find mode-specific projections that optimally separate tensor data from different classes. However, to solve this task, standard MDA methods rely on alternating optimization heuristics involving the computation of a succession of tensor-matrix products. Such approaches are often difficult to solve and unnatural, highlighting the difficulty of formulating this problem in a fully tensor form. In this paper, we propose to solve multilinear discriminant analysis (MDA) using the concept of transform domain (TD) recently proposed in [15]. We show that moving MDA to this transform domain makes its resolution easier and more natural. More precisely, each frontal face of the transformed tensor is processed independently, yielding separate optimization sub-problems that are easier to solve. The obtained solutions are then converted into projective tensors by applying the inverse transform. Through extensive experiments, we demonstrate the effectiveness of our approach compared with existing MDA methods.
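To make the transform-domain pipeline concrete, the following is a minimal sketch, not the authors' exact algorithm: it assumes the transform is the DFT along the third mode (as in the standard t-product framework), that each sample is a matrix stored as a lateral slice, and that the slice-wise sub-problem is a classical LDA-style generalized eigenproblem, which may differ from the criterion actually optimized in the paper.

```python
import numpy as np

def transform_domain_mda(X, y, p):
    """Illustrative sketch of transform-domain discriminant analysis.

    X : (N, n1, n3) real array; N samples, each an n1 x n3 matrix viewed
        as a lateral slice of a third-order tensor.
    y : (N,) integer class labels.
    p : number of discriminant directions kept per frontal slice.
    Returns U : (n1, p, n3) projective tensor obtained by inverse transform.
    """
    N, n1, n3 = X.shape
    classes = np.unique(y)
    # Move to the transform domain: DFT along the third (tube) mode.
    Xh = np.fft.fft(X, axis=2)                      # (N, n1, n3), complex
    Uh = np.zeros((n1, p, n3), dtype=complex)
    for k in range(n3):                             # one sub-problem per frontal slice
        Z = Xh[:, :, k]                             # (N, n1) features of slice k
        mu = Z.mean(axis=0)
        Sw = np.zeros((n1, n1), dtype=complex)      # within-class scatter
        Sb = np.zeros((n1, n1), dtype=complex)      # between-class scatter
        for c in classes:
            Zc = Z[y == c]
            mc = Zc.mean(axis=0)
            D = Zc - mc
            Sw += D.conj().T @ D
            d = (mc - mu)[:, None]
            Sb += len(Zc) * (d @ d.conj().T)
        # Slice-wise generalized eigenproblem Sb u = lambda Sw u
        # (a small ridge keeps Sw invertible in this toy sketch).
        evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(n1), Sb))
        order = np.argsort(-evals.real)[:p]
        Uh[:, :, k] = evecs[:, order]
    # Back to the original domain: inverse DFT along the third mode.
    # Taking the real part is a simplification; conjugate symmetry of the
    # slice-wise solutions is not enforced here.
    return np.real(np.fft.ifft(Uh, axis=2))
```

A sample matrix can then be projected mode-wise with the recovered tensor (e.g., via the t-product), illustrating how the independent slice-wise solutions recombine into a single projective tensor.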