Multilayer sparsity-based tensor decomposition for low-rank tensor completion
- Author
- Jize Xue, Yongqiang Zhao, Shaoguang Huang (UGent), Wenzhi Liao (UGent), Jonathan Cheung-Wai Chan and Seong G. Kong
- Abstract
- Existing methods for tensor completion (TC) have limited ability to characterize low-rank (LR) structures. To depict the complex hierarchical knowledge with implicit sparsity attributes hidden in a tensor, we propose a new multilayer sparsity-based tensor decomposition (MLSTD) for low-rank tensor completion (LRTC). The method encodes the structured sparsity of a tensor by a multiple-layer representation. Specifically, we use the CANDECOMP/PARAFAC (CP) model to decompose a tensor into a sum of rank-1 tensors, and the number of rank-1 components is naturally interpreted as the first-layer sparsity measure. The factor matrices are presumed smooth, since a local piecewise property exists in the within-mode correlation; in the subspace, this local smoothness can be regarded as the second-layer sparsity. To describe the refined structures of factor/subspace sparsity, we introduce a new sparsity insight into subspace smoothness: a self-adaptive low-rank matrix factorization (LRMF) scheme, called the third-layer sparsity. Through this progressive description of the sparsity structure, we formulate an MLSTD model and embed it into the LRTC problem. An effective alternating direction method of multipliers (ADMM) algorithm is then designed for the MLSTD minimization problem. Experiments on RGB images, hyperspectral images (HSIs), and videos substantiate that the proposed LRTC methods are superior to state-of-the-art methods.
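The first two sparsity layers from the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's method: the tensor sizes and CP rank are arbitrary, and the first-order difference penalty is only a simple stand-in for the factor-smoothness prior described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a third-order tensor of shape (4, 5, 6) with CP rank R = 3.
I, J, K, R = 4, 5, 6, 3

# Factor matrices of the CP model, one per mode.
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# CP reconstruction as a sum of R rank-1 tensors a_r (outer) b_r (outer) c_r.
# The number of components R plays the role of the first-layer sparsity measure.
X = np.zeros((I, J, K))
for r in range(R):
    X += np.einsum("i,j,k->ijk", A[:, r], B[:, r], C[:, r])

# Equivalent one-shot contraction over the shared component index r.
X2 = np.einsum("ir,jr,kr->ijk", A, B, C)
assert np.allclose(X, X2)

# Second-layer sparsity (illustrative stand-in): penalize non-smooth factor
# columns via the l1 norm of their first-order differences along each mode.
smoothness_penalty = sum(np.abs(np.diff(F, axis=0)).sum() for F in (A, B, C))
```

A completion method would minimize a data-fit term over the observed entries plus such sparsity-inducing penalties; the paper additionally factorizes the factor matrices themselves (the third layer) and solves the resulting problem with ADMM.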
- Keywords
- Artificial Intelligence, Computer Networks and Communications, Computer Science Applications, Software, Tensors, Matrix decomposition, Correlation, Nonhomogeneous media, Minimization, Transmission line measurements, Biomedical measurement, CANDECOMP, PARAFAC (CP) decomposition, factor smooth prior, low-rank tensor completion (LRTC), multilayer sparsity (MLS) constraints, subspace structured sparsity
Downloads
- (...).pdf | full text (Published version) | UGent only | 7.20 MB
- Accepted VERSION.pdf | full text (Accepted manuscript) | open access | 17.75 MB
Citation
Please use this url to cite or link to this publication: http://hdl.handle.net/1854/LU-8732231
- MLA
- Xue, Jize, et al. “Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, vol. 33, no. 11, 2022, pp. 6916–30, doi:10.1109/tnnls.2021.3083931.
- APA
- Xue, J., Zhao, Y., Huang, S., Liao, W., Chan, J. C.-W., & Kong, S. G. (2022). Multilayer sparsity-based tensor decomposition for low-rank tensor completion. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 33(11), 6916–6930. https://doi.org/10.1109/tnnls.2021.3083931
- Chicago author-date
- Xue, Jize, Yongqiang Zhao, Shaoguang Huang, Wenzhi Liao, Jonathan Cheung-Wai Chan, and Seong G. Kong. 2022. “Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 33 (11): 6916–30. https://doi.org/10.1109/tnnls.2021.3083931.
- Chicago author-date (all authors)
- Xue, Jize, Yongqiang Zhao, Shaoguang Huang, Wenzhi Liao, Jonathan Cheung-Wai Chan, and Seong G. Kong. 2022. “Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 33 (11): 6916–6930. doi:10.1109/tnnls.2021.3083931.
- Vancouver
- 1. Xue J, Zhao Y, Huang S, Liao W, Chan JC-W, Kong SG. Multilayer sparsity-based tensor decomposition for low-rank tensor completion. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS. 2022;33(11):6916–30.
- IEEE
- [1] J. Xue, Y. Zhao, S. Huang, W. Liao, J. C.-W. Chan, and S. G. Kong, “Multilayer sparsity-based tensor decomposition for low-rank tensor completion,” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, vol. 33, no. 11, pp. 6916–6930, 2022.
@article{8732231,
  author   = {{Xue, Jize and Zhao, Yongqiang and Huang, Shaoguang and Liao, Wenzhi and Chan, Jonathan Cheung-Wai and Kong, Seong G.}},
  title    = {{Multilayer sparsity-based tensor decomposition for low-rank tensor completion}},
  journal  = {{IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS}},
  year     = {{2022}},
  volume   = {{33}},
  number   = {{11}},
  pages    = {{6916--6930}},
  issn     = {{2162-237X}},
  language = {{eng}},
  url      = {{http://doi.org/10.1109/tnnls.2021.3083931}},
  keywords = {{Artificial Intelligence, Computer Networks and Communications, Computer Science Applications, Software, Tensors, Matrix decomposition, Correlation, Nonhomogeneous media, Minimization, Transmission line measurements, Biomedical measurement, CANDECOMP, PARAFAC (CP) decomposition, factor smooth prior, low-rank tensor completion (LRTC), multilayer sparsity (MLS) constraints, subspace structured sparsity}},
  abstract = {{Existing methods for tensor completion (TC) have limited ability for characterizing low-rank (LR) structures. To depict the complex hierarchical knowledge with implicit sparsity attributes hidden in a tensor, we propose a new multilayer sparsity-based tensor decomposition (MLSTD) for the low-rank tensor completion (LRTC). The method encodes the structured sparsity of a tensor by the multiple-layer representation. Specifically, we use the CANDECOMP/PARAFAC (CP) model to decompose a tensor into an ensemble of the sum of rank-1 tensors, and the number of rank-1 components is easily interpreted as the first-layer sparsity measure. Presumably, the factor matrices are smooth since local piecewise property exists in within-mode correlation. In subspace, the local smoothness can be regarded as the second-layer sparsity. To describe the refined structures of factor/subspace sparsity, we introduce a new sparsity insight of subspace smoothness: a self-adaptive low-rank matrix factorization (LRMF) scheme, called the third-layer sparsity. By the progressive description of the sparsity structure, we formulate an MLSTD model and embed it into the LRTC problem. Then, an effective alternating direction method of multipliers (ADMM) algorithm is designed for the MLSTD minimization problem. Various experiments in RGB images, hyperspectral images (HSIs), and videos substantiate that the proposed LRTC methods are superior to state-of-the-art methods.}},
}