From Algebraic Structures to Tensors

Gérard Favier

$295.95

Hardback

English
ISTE Ltd and John Wiley & Sons Inc
25 October 2019
Nowadays, tensors play a central role in the representation, mining, analysis, and fusion of multidimensional, multimodal, and heterogeneous big data in numerous fields.

This set on Matrices and Tensors in Signal Processing aims to give a self-contained and comprehensive presentation of various concepts and methods, starting from fundamental algebraic structures and proceeding to advanced tensor-based applications, including recently developed tensor models and efficient algorithms for dimensionality reduction and parameter estimation. Although its title suggests an orientation towards signal processing, the results presented in this set will also be of use to readers interested in other disciplines.

This first book provides an introduction to matrices and higher-order tensors based on the structures of vector space and tensor space. Some standard algebraic structures are first described, with a focus on the Hilbertian approach to signal representation and on function approximation based on Fourier series and orthogonal polynomial series. Matrices and hypermatrices associated with linear, bilinear and multilinear maps are studied in particular detail. Some basic results are presented for block matrices. The notions of decomposition, rank, eigenvalue, singular value, and unfolding of a tensor are then introduced, emphasizing the similarities and differences between matrices and higher-order tensors.
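As a rough illustration of one of the notions mentioned above, the mode-n unfolding (matricization) of a higher-order array reshapes it into a matrix whose rows index a chosen mode and whose columns run over the remaining modes. The sketch below is not taken from the book; the array, its dimensions, and the column-ordering convention are arbitrary choices made only for this example.

```python
# Minimal sketch of mode-n unfolding with NumPy (illustrative assumptions:
# a 2 x 3 x 4 array and one common column-ordering convention).
import numpy as np

T = np.arange(24).reshape(2, 3, 4)   # a third-order array of size 2 x 3 x 4

def unfold(tensor, mode):
    """Matricize `tensor` along `mode`: rows index that mode,
    columns enumerate all remaining modes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

print(unfold(T, 0).shape)  # (2, 12)
print(unfold(T, 1).shape)  # (3, 8)
print(unfold(T, 2).shape)  # (4, 6)
```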

Edited by:  
Imprint:   ISTE Ltd and John Wiley & Sons Inc
Country of Publication:   United Kingdom
Dimensions:   Height: 236mm,  Width: 160mm,  Spine: 25mm
Weight:   635g
ISBN:   9781786301543
ISBN 10:   1786301547
Pages:   320
Publication Date:   25 October 2019
Audience:   Professional and scholarly, Undergraduate
Format:   Hardback
Publisher's Status:   Active
Table of Contents

Preface xi
Chapter 1. Historical Elements of Matrices and Tensors 1
Chapter 2. Algebraic Structures 9
2.1. A few historical elements 9
2.2. Chapter summary 11
2.3. Sets 12
2.3.1. Definitions 12
2.3.2. Sets of numbers 13
2.3.3. Cartesian product of sets 13
2.3.4. Set operations 14
2.3.5. De Morgan's laws 15
2.3.6. Characteristic functions 15
2.3.7. Partitions 16
2.3.8. σ-algebras or σ-fields 16
2.3.9. Equivalence relations 16
2.3.10. Order relations 17
2.4. Maps and composition of maps 17
2.4.1. Definitions 17
2.4.2. Properties 18
2.4.3. Composition of maps 18
2.5. Algebraic structures 18
2.5.1. Laws of composition 18
2.5.2. Definition of algebraic structures 22
2.5.3. Substructures 24
2.5.4. Quotient structures 24
2.5.5. Groups 24
2.5.6. Rings 27
2.5.7. Fields 32
2.5.8. Modules 33
2.5.9. Vector spaces 33
2.5.10. Vector spaces of linear maps 38
2.5.11. Vector spaces of multilinear maps 39
2.5.12. Vector subspaces 41
2.5.13. Bases 43
2.5.14. Sum and direct sum of subspaces 45
2.5.15. Quotient vector spaces 47
2.5.16. Algebras 47
2.6. Morphisms 49
2.6.1. Group morphisms 49
2.6.2. Ring morphisms 51
2.6.3. Morphisms of vector spaces or linear maps 51
2.6.4. Algebra morphisms 56
Chapter 3. Banach and Hilbert Spaces - Fourier Series and Orthogonal Polynomials 57
3.1. Introduction and chapter summary 57
3.2. Metric spaces 59
3.2.1. Definition of distance 60
3.2.2. Definition of topology 60
3.2.3. Examples of distances 61
3.2.4. Inequalities and equivalent distances 62
3.2.5. Distance and convergence of sequences 62
3.2.6. Distance and local continuity of a function 62
3.2.7. Isometries and Lipschitzian maps 63
3.3. Normed vector spaces 63
3.3.1. Definition of norm and triangle inequalities 63
3.3.2. Examples of norms 64
3.3.3. Equivalent norms 68
3.3.4. Distance associated with a norm 69
3.4. Pre-Hilbert spaces 69
3.4.1. Real pre-Hilbert spaces 70
3.4.2. Complex pre-Hilbert spaces 70
3.4.3. Norm induced from an inner product 72
3.4.4. Distance associated with an inner product 75
3.4.5. Weighted inner products 76
3.5. Orthogonality and orthonormal bases 76
3.5.1. Orthogonal/perpendicular vectors and Pythagorean theorem 76
3.5.2. Orthogonal subspaces and orthogonal complement 77
3.5.3. Orthonormal bases 79
3.5.4. Orthogonal/unitary endomorphisms and isometries 79
3.6. Gram-Schmidt orthonormalization process 80
3.6.1. Orthogonal projection onto a subspace 80
3.6.2. Orthogonal projection and Fourier expansion 80
3.6.3. Bessel's inequality and Parseval's equality 82
3.6.4. Gram-Schmidt orthonormalization process 83
3.6.5. QR decomposition 85
3.6.6. Application to the orthonormalization of a set of functions 86
3.7. Banach and Hilbert spaces 88
3.7.1. Complete metric spaces 88
3.7.2. Adherence, density and separability 90
3.7.3. Banach and Hilbert spaces 91
3.7.4. Hilbert bases 93
3.8. Fourier series expansions 97
3.8.1. Fourier series, Parseval's equality and Bessel's inequality 97
3.8.2. Case of 2π-periodic functions from R to C 97
3.8.3. T-periodic functions from R to C 102
3.8.4. Partial Fourier sums and Bessel's inequality 102
3.8.5. Convergence of Fourier series 103
3.8.6. Examples of Fourier series 108
3.9. Expansions over bases of orthogonal polynomials 117
Chapter 4. Matrix Algebra 123
4.1. Chapter summary 123
4.2. Matrix vector spaces 124
4.2.1. Notations and definitions 124
4.2.2. Partitioned matrices 125
4.2.3. Matrix vector spaces 126
4.3. Some special matrices 127
4.4. Transposition and conjugate transposition 128
4.5. Vectorization 130
4.6. Vector inner product, norm and orthogonality 130
4.6.1. Inner product 130
4.6.2. Euclidean/Hermitian norm 131
4.6.3. Orthogonality 131
4.7. Matrix multiplication 132
4.7.1. Definition and properties 132
4.7.2. Powers of a matrix 134
4.8. Matrix trace, inner product and Frobenius norm 137
4.8.1. Definition and properties of the trace 137
4.8.2. Matrix inner product 138
4.8.3. Frobenius norm 138
4.9. Subspaces associated with a matrix 139
4.10. Matrix rank 141
4.10.1. Definition and properties 141
4.10.2. Sum and difference rank 143
4.10.3. Subspaces associated with a matrix product 143
4.10.4. Product rank 144
4.11. Determinant, inverses and generalized inverses 145
4.11.1. Determinant 145
4.11.2. Matrix inversion 148
4.11.3. Solution of a homogeneous system of linear equations 149
4.11.4. Complex matrix inverse 150
4.11.5. Orthogonal and unitary matrices 150
4.11.6. Involutory matrices and anti-involutory matrices 151
4.11.7. Left and right inverses of a rectangular matrix 153
4.11.8. Generalized inverses 155
4.11.9. Moore-Penrose pseudo-inverse 157
4.12. Multiplicative groups of matrices 158
4.13. Matrix associated to a linear map 159
4.13.1. Matrix representation of a linear map 159
4.13.2. Change of basis 162
4.13.3. Endomorphisms 164
4.13.4. Nilpotent endomorphisms 166
4.13.5. Equivalent, similar and congruent matrices 167
4.14. Matrix associated with a bilinear/sesquilinear form 168
4.14.1. Definition of a bilinear/sesquilinear map 168
4.14.2. Matrix associated to a bilinear/sesquilinear form 170
4.14.3. Changes of bases with a bilinear form 170
4.14.4. Changes of bases with a sesquilinear form 171
4.14.5. Symmetric bilinear/sesquilinear forms 172
4.15. Quadratic forms and Hermitian forms 174
4.15.1. Quadratic forms 174
4.15.2. Hermitian forms 176
4.15.3. Positive/negative definite quadratic/Hermitian forms 177
4.15.4. Examples of positive definite quadratic forms 178
4.15.5. Cauchy-Schwarz and Minkowski inequalities 179
4.15.6. Orthogonality, rank, kernel and degeneration of a bilinear form 180
4.15.7. Gauss reduction method and Sylvester's inertia law 181
4.16. Eigenvalues and eigenvectors 184
4.16.1. Characteristic polynomial and Cayley-Hamilton theorem 184
4.16.2. Right eigenvectors 186
4.16.3. Spectrum and regularity/singularity conditions 187
4.16.4. Left eigenvectors 188
4.16.5. Properties of eigenvectors 188
4.16.6. Eigenvalues and eigenvectors of a regularized matrix 190
4.16.7. Other properties of eigenvalues 190
4.16.8. Symmetric/Hermitian matrices 191
4.16.9. Orthogonal/unitary matrices 193
4.16.10. Eigenvalues and extrema of the Rayleigh quotient 194
4.17. Generalized eigenvalues 195
Chapter 5. Partitioned Matrices 199
5.1. Introduction 199
5.2. Submatrices 200
5.3. Partitioned matrices 201
5.4. Matrix products and partitioned matrices 202
5.4.1. Matrix products 202
5.4.2. Vector Kronecker product 202
5.4.3. Matrix Kronecker product 202
5.4.4. Khatri-Rao product 204
5.5. Special cases of partitioned matrices 205
5.5.1. Block-diagonal matrices 205
5.5.2. Signature matrices 205
5.5.3. Direct sum 205
5.5.4. Jordan forms 206
5.5.5. Block-triangular matrices 206
5.5.6. Block Toeplitz and Hankel matrices 207
5.6. Transposition and conjugate transposition 207
5.7. Trace 208
5.8. Vectorization 208
5.9. Blockwise addition 208
5.10. Blockwise multiplication 209
5.11. Hadamard product of partitioned matrices 209
5.12. Kronecker product of partitioned matrices 210
5.13. Elementary operations and elementary matrices 212
5.14. Inversion of partitioned matrices 214
5.14.1. Inversion of block-diagonal matrices 215
5.14.2. Inversion of block-triangular matrices 215
5.14.3. Block-triangularization and Schur complements 216
5.14.4. Block-diagonalization and block-factorization 216
5.14.5. Block-inversion and partitioned inverse 217
5.14.6. Other formulae for the partitioned 2 x 2 inverse 218
5.14.7. Solution of a system of linear equations 219
5.14.8. Inversion of a partitioned Gram matrix 220
5.14.9. Iterative inversion of a partitioned square matrix 220
5.14.10. Matrix inversion lemma and applications 221
5.15. Generalized inverses of 2 x 2 block matrices 222
5.16. Determinants of partitioned matrices 224
5.16.1. Determinant of block-diagonal matrices 224
5.16.2. Determinant of block-triangular matrices 225
5.16.3. Determinant of partitioned matrices with square diagonal blocks 225
5.16.4. Determinants of specific partitioned matrices 226
5.16.5. Eigenvalues of CB and BC 227
5.17. Rank of partitioned matrices 228
5.18. Levinson-Durbin algorithm 229
5.18.1. AR process and Yule-Walker equations 230
5.18.2. Levinson-Durbin algorithm 232
5.18.3. Linear prediction 237
Chapter 6. Tensor Spaces and Tensors 243
6.1. Chapter summary 243
6.2. Hypermatrices 243
6.2.1. Hypermatrix vector spaces 244
6.2.2. Hypermatrix inner product and Frobenius norm 245
6.2.3. Contraction operation and n-mode hypermatrix-matrix product 245
6.3. Outer products 249
6.4. Multilinear forms, homogeneous polynomials and hypermatrices 251
6.4.1. Hypermatrix associated to a multilinear form 251
6.4.2. Symmetric multilinear forms and symmetric hypermatrices 252
6.5. Multilinear maps and homogeneous polynomials 255
6.6. Tensor spaces and tensors 255
6.6.1. Definitions 255
6.6.2. Multilinearity and associativity 257
6.6.3. Tensors and coordinate hypermatrices 257
6.6.4. Canonical writing of tensors 258
6.6.5. Expansion of the tensor product of N vectors 260
6.6.6. Properties of the tensor product 261
6.6.7. Change of basis formula 266
6.7. Tensor rank and tensor decompositions 268
6.7.1. Matrix rank 268
6.7.2. Hypermatrix rank 268
6.7.3. Symmetric rank of a hypermatrix 269
6.7.4. Comparative properties of hypermatrices and matrices 269
6.7.5. CPD and dimensionality reduction 271
6.7.6. Tensor rank 273
6.8. Eigenvalues and singular values of a hypermatrix 274
6.9. Isomorphisms of tensor spaces 276
References 281
Index 291

Gérard Favier is currently an Emeritus Research Director with the CNRS at the I3S Laboratory in Sophia Antipolis, France. His research interests include nonlinear system modeling and identification, signal processing applications, tensor models with associated algorithms for big data processing, and tensor approaches for MIMO communication systems.
