Low-Rank Tensor Completion Based on Self-Adaptive Learnable Transforms
Author: | Tongle Wu, Bin Gao, Jicong Fan, Jize Xue, W. L. Woo |
---|---|
Year of publication: | 2022 |
Source: | IEEE Transactions on Neural Networks and Learning Systems, pp. 1-13 |
ISSN: | 2162-2388, 2162-237X |
DOI: | 10.1109/tnnls.2022.3215974 |
Description: | The tensor nuclear norm (TNN), defined as the sum of the nuclear norms of the frontal slices of a tensor in a frequency domain, has been found useful in solving low-rank tensor recovery problems. Existing TNN-based methods use either fixed or data-independent transformations, which may not be the optimal choices for the given tensors. As a consequence, these methods cannot adaptively exploit the potential low-rank structure of tensor data. In this article, we propose a framework called self-adaptive learnable transform (SALT) to learn a transformation matrix from the given tensor. Specifically, SALT aims to learn a lossless transformation that induces a lower average-rank tensor, where the Schatten-p quasi-norm is used as the rank proxy. Then, because SALT is less sensitive to the orientation, we generalize SALT to the other tensor dimensions (SALTS), namely, learning three self-adaptive transformation matrices simultaneously from the given tensor. SALTS is able to adaptively exploit the potential low-rank structures in all directions. We provide a unified optimization framework based on the alternating direction method of multipliers (ADMM) for the SALTS model and theoretically prove the weak convergence property of the proposed algorithm. Experimental results on hyperspectral image (HSI), color video, magnetic resonance imaging (MRI), and COIL-20 datasets show that SALTS is much more accurate in tensor completion than existing methods. The demo code can be found at https://faculty.uestc.edu.cn/gaobin/zh_CN/lwcg/153392/list/index.htm. (A brief illustrative sketch of the TNN computation follows this record.) |
Database: | OpenAIRE |
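
The abstract above quotes the standard TNN definition: the sum of the nuclear norms of the frontal slices of a tensor after moving to a frequency domain. The sketch below illustrates only that quantity for the classical fixed-transform case (a DFT along the third mode), not the SALT/SALTS method itself, which learns the transformation instead; the function name `tensor_nuclear_norm`, the use of NumPy, and the 1/n3 scaling convention are assumptions for illustration.

```python
import numpy as np

def tensor_nuclear_norm(X):
    """Sum of the nuclear norms of the frontal slices of X after a DFT
    along the third mode, divided by n3 (a common t-SVD convention)."""
    n3 = X.shape[2]
    Xf = np.fft.fft(X, axis=2)  # transform to the frequency domain along mode 3
    slice_norms = [
        np.linalg.svd(Xf[:, :, k], compute_uv=False).sum()  # nuclear norm of slice k
        for k in range(n3)
    ]
    return sum(slice_norms) / n3

# A low-tubal-rank tensor should have a much smaller TNN than a dense random one.
rng = np.random.default_rng(0)
a, b, c = rng.standard_normal(30), rng.standard_normal(20), rng.standard_normal(10)
low_rank = np.einsum("i,j,k->ijk", a, b, c)  # outer product: tubal rank 1
noise = rng.standard_normal((30, 20, 10))
print(tensor_nuclear_norm(low_rank), tensor_nuclear_norm(noise))
```

In the SALT/SALTS setting described in the abstract, the fixed DFT above would be replaced by transformation matrices learned from the given tensor; this snippet only shows the kind of low-rank surrogate being minimized in the fixed-transform baseline.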