
Tensor completion using bilayer multimode low-rank prior and total variation

Abstract
In this article, we propose a novel bilayer low-rankness measure and two models based on it to recover a low-rank (LR) tensor. The global low rankness of the underlying tensor is first encoded by applying LR matrix factorizations (MFs) to all mode matricizations, which exploits multiorientational spectral low rankness. The factor matrices of this all-mode decomposition are presumed to be LR themselves, since local low rankness exists in the within-mode correlation. To describe the refined local LR structure of the factors in the decomposed subspace, a double nuclear norm scheme is designed to explore this so-called second-layer low rankness. By simultaneously representing the bilayer low rankness of all modes of the underlying tensor, the proposed methods model multiorientational correlations for arbitrary N-way (N ≥ 3) tensors. A block successive upper-bound minimization (BSUM) algorithm is designed to solve the optimization problem. Subsequence convergence of our algorithms can be established, and under mild conditions the iterates they generate converge to coordinatewise minimizers. Experiments on several types of public datasets show that our algorithm can recover a variety of LR tensors from significantly fewer samples than its counterparts.
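
The abstract's bilayer construction can be illustrated concretely: each mode matricization of the tensor is factorized into two LR factor matrices (first layer), and the nuclear norms of those factors are then measured (the second-layer double nuclear norm). The sketch below, in NumPy, is an illustrative toy only, not the authors' implementation; the function names (`unfold`, `bilayer_measure`), the SVD-based factorization, and the rank parameter are assumptions for exposition.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n matricization: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def bilayer_measure(tensor, rank):
    """Toy bilayer low-rankness measure.

    First layer:  factorize each mode matricization X_(n) ~= A @ B
                  (here via a truncated SVD, one common choice of MF).
    Second layer: sum the nuclear norms of the factors A and B, the
                  'double nuclear norm' on the decomposed subspace.
    """
    total = 0.0
    for n in range(tensor.ndim):
        xn = unfold(tensor, n)
        u, s, vt = np.linalg.svd(xn, full_matrices=False)
        a = u[:, :rank] * np.sqrt(s[:rank])            # factor A
        b = np.sqrt(s[:rank])[:, None] * vt[:rank]     # factor B
        total += np.linalg.norm(a, "nuc") + np.linalg.norm(b, "nuc")
    return total

# Toy 3-way tensor (N = 3, the smallest case the paper targets).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 5, 6))
score = bilayer_measure(x, rank=2)
```

In the paper this measure serves as a regularizer inside a completion model solved by BSUM; the sketch only shows how the two layers of low rankness are assembled from the all-mode matricizations.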
Keywords
Artificial Intelligence, Computer Networks and Communications, Computer Science Applications, Software, Bilayer low rank (LR), higher-order tensor, multimode decomposition, total variation (TV), NUCLEAR NORM, MATRIX COMPLETION, ALGORITHM, MINIMIZATION, MODELS

Downloads

  • Tensor Completion Using Bilayer Multimode Low-Rank Prior and Total Variation.pdf (full text, accepted manuscript; open access; PDF; 5.38 MB)

Citation

Please use this URL to cite or link to this publication:

MLA
Zeng, Haijin, et al. “Tensor Completion Using Bilayer Multimode Low-Rank Prior and Total Variation.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, vol. 35, no. 10, 2024, pp. 13297–311, doi:10.1109/tnnls.2023.3266841.
APA
Zeng, H., Huang, S., Chen, Y., Liu, S., Luong, H., & Philips, W. (2024). Tensor completion using bilayer multimode low-rank prior and total variation. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 35(10), 13297–13311. https://doi.org/10.1109/tnnls.2023.3266841
Chicago author-date
Zeng, Haijin, Shaoguang Huang, Yongyong Chen, Sheng Liu, Hiep Luong, and Wilfried Philips. 2024. “Tensor Completion Using Bilayer Multimode Low-Rank Prior and Total Variation.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 35 (10): 13297–311. https://doi.org/10.1109/tnnls.2023.3266841.
Chicago author-date (all authors)
Zeng, Haijin, Shaoguang Huang, Yongyong Chen, Sheng Liu, Hiep Luong, and Wilfried Philips. 2024. “Tensor Completion Using Bilayer Multimode Low-Rank Prior and Total Variation.” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 35 (10): 13297–13311. doi:10.1109/tnnls.2023.3266841.
Vancouver
1. Zeng H, Huang S, Chen Y, Liu S, Luong H, Philips W. Tensor completion using bilayer multimode low-rank prior and total variation. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS. 2024;35(10):13297–311.
IEEE
[1] H. Zeng, S. Huang, Y. Chen, S. Liu, H. Luong, and W. Philips, “Tensor completion using bilayer multimode low-rank prior and total variation,” IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, vol. 35, no. 10, pp. 13297–13311, 2024.
@article{01H0QG1N09BFA1QRW2JJ50WGQR,
  abstract     = {{In this article, we propose a novel bilayer low-rankness measure and two models based on it to recover a low-rank (LR) tensor. The global low rankness of the underlying tensor is first encoded by applying LR matrix factorizations (MFs) to all mode matricizations, which exploits multiorientational spectral low rankness. The factor matrices of this all-mode decomposition are presumed to be LR themselves, since local low rankness exists in the within-mode correlation. To describe the refined local LR structure of the factors in the decomposed subspace, a double nuclear norm scheme is designed to explore this so-called second-layer low rankness. By simultaneously representing the bilayer low rankness of all modes of the underlying tensor, the proposed methods model multiorientational correlations for arbitrary N-way (N ≥ 3) tensors. A block successive upper-bound minimization (BSUM) algorithm is designed to solve the optimization problem. Subsequence convergence of our algorithms can be established, and under mild conditions the iterates they generate converge to coordinatewise minimizers. Experiments on several types of public datasets show that our algorithm can recover a variety of LR tensors from significantly fewer samples than its counterparts.}},
  author       = {{Zeng, Haijin and Huang, Shaoguang and Chen, Yongyong and Liu, Sheng and Luong, Hiep and Philips, Wilfried}},
  issn         = {{2162-237X}},
  journal      = {{IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS}},
  keywords     = {{Artificial Intelligence,Computer Networks and Communications,Computer Science Applications,Software,Bilayer low rank (LR),higher-order tensor,multimode decomposition,total variation (TV),NUCLEAR NORM,MATRIX COMPLETION,ALGORITHM,MINIMIZATION,MODELS}},
  language     = {{eng}},
  number       = {{10}},
  pages        = {{13297--13311}},
  title        = {{Tensor completion using bilayer multimode low-rank prior and total variation}},
  url          = {{https://doi.org/10.1109/tnnls.2023.3266841}},
  volume       = {{35}},
  year         = {{2024}},
}
