
Tensors for Data Processing
Theory, Methods, and Applications
- 1st Edition - October 21, 2021
- Imprint: Academic Press
- Editor: Yipeng Liu
- Language: English
- Paperback ISBN: 978-0-12-824447-0
- eBook ISBN: 978-0-323-85965-3
Tensors for Data Processing: Theory, Methods and Applications presents both classical and state-of-the-art methods on tensor computation for data processing, covering computation theories, processing methods, computing and engineering applications, with an emphasis on techniques for data processing. This reference is ideal for students, researchers and industry developers who want to understand and use tensor-based data processing theories and methods.
As a higher-order generalization of a matrix, tensor-based processing can avoid the loss of multi-linear data structure that occurs in classical matrix-based data processing methods. This move from matrices to tensors benefits many diverse application areas, including signal processing, computer science, acoustics, neuroscience, communication, medical engineering, seismology, psychometrics, chemometrics, biometrics, quantum physics and quantum chemistry.
- Provides a complete reference on classical and state-of-the-art tensor-based methods for data processing
- Includes a wide range of applications from different disciplines
- Gives guidance on applying these methods in practice
- Cover image
- Title page
- Table of Contents
- Copyright
- List of contributors
- Preface
- Chapter 1: Tensor decompositions: computations, applications, and challenges
- Abstract
- 1.1. Introduction
- 1.2. Tensor operations
- 1.3. Tensor decompositions
- 1.4. Tensor processing techniques
- 1.5. Challenges
- References
- Chapter 2: Transform-based tensor singular value decomposition in multidimensional image recovery
- Abstract
- 2.1. Introduction
- 2.2. Recent advances of the tensor singular value decomposition
- 2.3. Transform-based t-SVD
- 2.4. Numerical experiments
- 2.5. Conclusions and new guidelines
- References
- Chapter 3: Partensor
- Abstract
- Acknowledgement
- 3.1. Introduction
- 3.2. Tensor decomposition
- 3.3. Tensor decomposition with missing elements
- 3.4. Distributed memory implementations
- 3.5. Numerical experiments
- 3.6. Conclusion
- References
- Chapter 4: A Riemannian approach to low-rank tensor learning
- Abstract
- 4.1. Introduction
- 4.2. A brief introduction to Riemannian optimization
- 4.3. Riemannian Tucker manifold geometry
- 4.4. Algorithms for tensor learning problems
- 4.5. Experiments
- 4.6. Conclusion
- References
- Chapter 5: Generalized thresholding for low-rank tensor recovery: approaches based on model and learning
- Abstract
- 5.1. Introduction
- 5.2. Tensor singular value thresholding
- 5.3. Thresholding based low-rank tensor recovery
- 5.4. Generalized thresholding algorithms with learning
- 5.5. Numerical examples
- 5.6. Conclusion
- References
- Chapter 6: Tensor principal component analysis
- Abstract
- 6.1. Introduction
- 6.2. Notations and preliminaries
- 6.3. Tensor PCA for Gaussian-noisy data
- 6.4. Tensor PCA for sparsely corrupted data
- 6.5. Tensor PCA for outlier-corrupted data
- 6.6. Other tensor PCA methods
- 6.7. Future work
- 6.8. Summary
- References
- Chapter 7: Tensors for deep learning theory
- Abstract
- 7.1. Introduction
- 7.2. Bounding a function's expressivity via tensorization
- 7.3. A case study: self-attention networks
- 7.4. Convolutional and recurrent networks
- 7.5. Conclusion
- References
- Chapter 8: Tensor network algorithms for image classification
- Abstract
- 8.1. Introduction
- 8.2. Background
- 8.3. Tensorial extensions of support vector machine
- 8.4. Tensorial extension of logistic regression
- 8.5. Conclusion
- References
- Chapter 9: High-performance tensor decompositions for compressing and accelerating deep neural networks
- Abstract
- 9.1. Introduction and motivation
- 9.2. Deep neural networks
- 9.3. Tensor networks and their decompositions
- 9.4. Compressing deep neural networks
- 9.5. Experiments and future directions
- References
- Chapter 10: Coupled tensor decompositions for data fusion
- Abstract
- Acknowledgements
- 10.1. Introduction
- 10.2. What is data fusion?
- 10.3. Decompositions in data fusion
- 10.4. Applications of tensor-based data fusion
- 10.5. Fusion of EEG and fMRI: a case study
- 10.6. Data fusion demos
- 10.7. Conclusion and prospects
- References
- Chapter 11: Tensor methods for low-level vision
- Abstract
- Acknowledgements
- 11.1. Low-level vision and signal reconstruction
- 11.2. Methods using raw tensor structure
- 11.3. Methods using tensorization
- 11.4. Examples of low-level vision applications
- 11.5. Remarks
- References
- Chapter 12: Tensors for neuroimaging
- Abstract
- 12.1. Introduction
- 12.2. Neuroimaging modalities
- 12.3. Multidimensionality of the brain
- 12.4. Tensor decomposition structures
- 12.5. Applications of tensors in neuroimaging
- 12.6. Future challenges
- 12.7. Conclusion
- References
- Chapter 13: Tensor representation for remote sensing images
- Abstract
- 13.1. Introduction
- 13.2. Optical remote sensing: HSI and MSI fusion
- 13.3. Polarimetric synthetic aperture radar: feature extraction
- References
- Chapter 14: Structured tensor train decomposition for speeding up kernel-based learning
- Abstract
- 14.1. Introduction
- 14.2. Notations and algebraic background
- 14.3. Standard tensor decompositions
- 14.4. Dimensionality reduction based on a train of low-order tensors
- 14.5. Tensor train algorithm
- 14.6. Kernel-based classification of high-order tensors
- 14.7. Experiments
- 14.8. Conclusion
- References
- Index
- No. of pages: 596