Stochastic Mirror Descent for Low-Rank Tensor Decomposition Under Non-Euclidean Losses. (arXiv:2104.14562v1 [stat.ML])

This work considers low-rank canonical polyadic decomposition (CPD) under a
class of non-Euclidean loss functions that frequently arise in statistical
machine learning and signal processing. These loss functions are often used for
certain types of tensor data, e.g., count and binary tensors, for which the
least squares loss is considered unnatural. Compared to the least squares loss,
non-Euclidean losses are generally more challenging to handle. Non-Euclidean
CPD has attracted considerable interest, and a number of prior works exist.
However, pressing computational and theoretical challenges, such as scalability
and convergence issues, still remain. This work offers a unified stochastic
algorithmic framework for large-scale CPD under a variety of non-Euclidean
loss functions. Our key contribution is a flexible stochastic mirror descent
framework built on a tensor fiber sampling strategy.
Leveraging the sampling scheme and the multilinear algebraic structure of
low-rank tensors, the proposed lightweight algorithm ensures global convergence
to a stationary point under reasonable conditions. Numerical results show that
our framework attains promising non-Euclidean CPD performance. The proposed
framework also exhibits substantial computational savings compared to
state-of-the-art methods.
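To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of stochastic mirror descent for nonnegative CPD under a generalized KL divergence loss, a common non-Euclidean loss for count tensors. Each iteration samples a batch of mode-n fibers, forms the corresponding rows of the Khatri-Rao product, and applies an exponentiated-gradient update (mirror descent with the negative-entropy mirror map). All sizes, step sizes, and the synthetic-data setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic setup: a rank-R nonnegative tensor observed with Poisson noise.
I, J, K, R = 30, 30, 30, 5
A_true = rng.gamma(2.0, 0.5, (I, R))
B_true = rng.gamma(2.0, 0.5, (J, R))
C_true = rng.gamma(2.0, 0.5, (K, R))
T = rng.poisson(np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)).astype(float)

def kl_loss(T, A, B, C):
    # Generalized KL divergence D(T || M), dropping terms constant in the factors.
    M = np.einsum('ir,jr,kr->ijk', A, B, C) + 1e-12
    return np.sum(M - T * np.log(M))

# Random positive initialization of the CPD factors.
factors = [rng.random((d, R)) + 0.5 for d in (I, J, K)]
shapes = (I, J, K)
eta, batch, iters = 0.05, 64, 300

loss_init = kl_loss(T, *factors)
for t in range(iters):
    n = t % 3                                   # cycle over the three modes
    others = [m for m in range(3) if m != n]
    # Fiber sampling: pick `batch` index pairs from the two non-updated modes.
    idx = [rng.integers(0, shapes[m], batch) for m in others]
    # Rows of the Khatri-Rao product restricted to the sampled fibers: (batch, R).
    H = factors[others[0]][idx[0]] * factors[others[1]][idx[1]]
    # Sampled mode-n fibers of the data tensor: (shapes[n], batch).
    X = np.moveaxis(T, n, 0)[:, idx[0], idx[1]]
    M = factors[n] @ H.T + 1e-12
    # Stochastic gradient of the generalized KL loss w.r.t. the mode-n factor.
    G = (1.0 - X / M) @ H / batch
    # Mirror descent step: exponentiated gradient keeps the factor positive.
    factors[n] *= np.exp(-eta * G)

loss_final = kl_loss(T, *factors)
```

Each update touches only `batch` fibers rather than the full tensor, which is the source of the computational savings the abstract refers to; the multiplicative update preserves nonnegativity without explicit projection.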
