May 06, 2006

Matrix cheat-sheet

[Math_] Normal matrices, spectral theorem, decompositions etc.

In the discussion below:
Q,U,V: unitary matrices (QQ*=I)
S: a diagonal matrix
R: an upper triangular matrix
X: a square matrix

A = any mxn matrix
=> A = USV* (SVD decomposition)
=> A = QR (thin QR-decomp with Q mxn having orthonormal columns, i.e. Q*Q=I)
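Both factorizations for a rectangular matrix can be checked numerically; a minimal sketch, assuming NumPy is available (the matrix here is an arbitrary random example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # an arbitrary m x n matrix, m > n

# SVD: A = U S V*, with S the diagonal matrix of singular values
U, s, Vh = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vh)

# Thin QR: A = QR, Q is m x n with orthonormal columns (Q*Q = I)
Q, R = np.linalg.qr(A)  # default mode 'reduced' gives the thin Q
assert np.allclose(A, Q @ R)
assert np.allclose(Q.T @ Q, np.eye(3))
```

Note that for the thin Q, Q*Q = I holds but QQ* does not, since Q is not square.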

A = any nxn square matrix
=> A = QRQ* (Schur decomposition)
=> A = QR (QR decomposition)
Note: the two are different. Schur is a similarity transform (so R carries the eigenvalues of A on its diagonal), while QR merely factors A into a unitary times an upper triangular matrix and says nothing about eigenvalues.
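The contrast can be seen directly; a sketch assuming SciPy is available (random matrix chosen only for illustration):

```python
import numpy as np
from scipy.linalg import qr, schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))  # an arbitrary square matrix

# Schur: A = Z T Z* with Z unitary, T upper triangular
T, Z = schur(A, output='complex')
assert np.allclose(A, Z @ T @ Z.conj().T)
# diagonal of T = eigenvalues of A (up to ordering)
assert np.allclose(np.sort_complex(np.diag(T)),
                   np.sort_complex(np.linalg.eigvals(A)))

# QR: A = QR -- a factorization, not a similarity;
# the diagonal of R is in general NOT the spectrum of A
Q, R = qr(A)
assert np.allclose(A, Q @ R)
```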

A = diagonalizable square matrix
=> A = XSX^-1, with X invertible and its columns eigenvectors of A (eigendecomposition)

A = normal matrix (i.e. AA* = A*A)
=> A = QSQ* (Eigenvectors orthogonal - Spectral Theorem)
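A quick NumPy sketch of the spectral theorem, using a Hermitian matrix (one of the subclasses of normal matrices listed below); the eigenvector matrix comes out unitary:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T  # Hermitian by construction: A* = A

# eigh exploits A* = A; eigenvalues w are real, Q is unitary
w, Q = np.linalg.eigh(A)
assert np.allclose(A, Q @ np.diag(w) @ Q.conj().T)   # A = Q S Q*
assert np.allclose(Q.conj().T @ Q, np.eye(4))        # eigenvectors orthonormal
```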

subclasses of normal matrices:
A = hermitian (i.e. A*=A, means symmetric if real)
A = skew-hermitian (i.e. A*=-A, skew-symmetric if real)
A = unitary (i.e. AA*=I, orthogonal if real)

subclasses of hermitian matrices:
A = positive semidefinite (i.e. all eigenvalues nonnegative)
=> SVD and eigendecomposition coincide: the eigenvalues are already non-negative, so they can serve as the singular values (with the orthonormal eigenvectors as both U and V).
A = positive definite (i.e. all eigenvalues positive)
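The coincidence of singular values and eigenvalues for a positive semidefinite matrix can be verified numerically; a sketch assuming NumPy (M M^T is an arbitrary way to build a PSD example):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M @ M.T  # symmetric positive semidefinite by construction

w = np.linalg.eigvalsh(A)               # eigenvalues, ascending, all >= 0
s = np.linalg.svd(A, compute_uv=False)  # singular values, descending
assert np.allclose(np.sort(s), w)       # same numbers, just ordered differently
assert np.all(w >= -1e-10)              # non-negative up to rounding
```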

(1) Excerpt from

These results translate immediately into results about matrices: For any normal matrix A, there exists a unitary matrix U such that

A=U \Lambda U*

where \Lambda is the diagonal matrix whose entries are the eigenvalues of A. Conversely, any matrix that can be diagonalized in this way must be normal.

The column vectors of U are the eigenvectors of A and they are orthogonal.

The spectral decomposition is a special case of the Schur decomposition. It also yields a singular value decomposition: absorbing the signs/phases of the eigenvalues into one of the unitary factors shows that the singular values of a normal matrix are the absolute values of its eigenvalues.

If A is a real symmetric matrix, it follows by the real version of the spectral theorem for symmetric operators that there is an orthogonal matrix U such that UAU* is diagonal and all the eigenvalues of A are real.

(2) Excerpt from

Decomposition theorems for general matrices

The decomposition theorem is a version of the spectral theorem in the particular case of matrices. It is usually introduced in terms of a coordinate transformation. If U is an invertible matrix, it can be seen as a transformation from one coordinate system to another, with the columns of U^-1 being the components of the new basis vectors within the old basis. In the new system the coordinates of a vector v are labeled v'; they are obtained from the coordinates v in the original system by v' = Uv and, the other way around, v = U^-1 v'. Applying successively v' = Uv, w' = Uw and U^-1 U = I to the relation Av = w defining the matrix multiplication gives A'v' = w' with A' = UAU^-1, the representation of A in the new basis. In this situation the matrices A and A' are said to be similar.
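The change of basis above can be sketched with NumPy: `np.linalg.eig` returns a matrix X whose columns are eigenvectors, which plays the role of U^-1 in the text's notation, so X^-1 A X is diagonal (the small symmetric matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# columns of X are eigenvectors of A (X corresponds to U^-1 above)
w, X = np.linalg.eig(A)

# the representation of A in the eigenvector basis is diagonal
D = np.linalg.inv(X) @ A @ X
assert np.allclose(D, np.diag(w))

# equivalently, A = X diag(w) X^-1
assert np.allclose(A, X @ np.diag(w) @ np.linalg.inv(X))
```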

The decomposition theorem states that, if one chooses as the columns of U^-1 n linearly independent eigenvectors of A, the new matrix A' = UAU^-1 is diagonal and its diagonal elements are the eigenvalues of A. If this is possible, the matrix A is diagonalizable. An example of a non-diagonalizable matrix is the Jordan block [[0, 1], [0, 0]], whose only eigenvalue 0 has just a one-dimensional eigenspace. There are several generalizations of this decomposition which can cope with the non-diagonalizable case, suited for different purposes:

* the Schur triangular form states that any matrix is unitarily equivalent to an upper triangular one;
* the singular value decomposition, A = UΣV* where Σ is diagonal with U and V unitary matrices. The diagonal entries of Σ are nonnegative; they are called the singular values of A. This can be done for non-square matrices as well;
* the Jordan normal form, A = XΛX^-1 where Λ is not diagonal but block-diagonal. The number and sizes of the Jordan blocks are dictated by the geometric and algebraic multiplicities of the eigenvalues. The Jordan decomposition is a fundamental result: it shows that a square matrix is described completely, up to similarity, by its eigenvalues together with the sizes of the associated Jordan blocks. This shows mathematically the important role played by eigenvalues in the study of matrices;
* as an immediate consequence of the Jordan decomposition, any square matrix A can be written uniquely as A = S + N where S is diagonalizable, N is nilpotent (i.e., N^q = 0 for some q), and S commutes with N (SN = NS).
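Both the Jordan form and the induced A = S + N splitting can be computed exactly; a sketch assuming SymPy is installed (the 3x3 matrix is built from an arbitrary Jordan block conjugated by an invertible P, purely for illustration):

```python
import sympy as sp

J0 = sp.Matrix([[2, 1, 0],
                [0, 2, 0],
                [0, 0, 3]])       # one 2x2 Jordan block plus a 1x1 block
P = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [0, 0, 1]])
A = P * J0 * P.inv()              # a non-diagonalizable 3x3 example

# Jordan normal form: A = X J X^-1
X, J = A.jordan_form()
assert X * J * X.inv() == A

# A = S + N: S diagonalizable, N nilpotent, SN = NS.
# Within each Jordan block the diagonal of J is a scalar, so it
# commutes with the strictly upper (nilpotent) part.
D = sp.diag(*[J[i, i] for i in range(3)])
S = X * D * X.inv()               # diagonalizable part
N = A - S                         # nilpotent part
assert N**3 == sp.zeros(3, 3)
assert S * N == N * S
```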
