In the mathematical discipline of linear algebra, eigendecomposition, sometimes called spectral decomposition, is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors.
If A is symmetric, then A = V * D * V' and V * V' = I, where the eigenvalue matrix D is diagonal and the eigenvector matrix V is orthogonal.
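As a minimal sketch of the symmetric case (assuming NumPy; the 2-by-2 matrix A below is only an illustrative choice), both identities can be checked numerically:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])              # symmetric, illustrative only

w, V = np.linalg.eigh(A)                # eigenvalues w and orthonormal eigenvectors V
D = np.diag(w)

print(np.allclose(A, V @ D @ V.T))      # True: A = V * D * V'
print(np.allclose(V @ V.T, np.eye(2)))  # True: V * V' = I, i.e. V is orthogonal
```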
If A is not symmetric, the eigenvalue matrix D is block diagonal, with the real eigenvalues in 1-by-1 blocks and each complex conjugate pair of eigenvalues, lambda ± i*mu, in a 2-by-2 block [lambda, mu; -mu, lambda].
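As a quick numerical check of the 2-by-2 block form (a sketch assuming NumPy, with illustrative values lambda = 3 and mu = 2), the block [lambda, mu; -mu, lambda] carries the conjugate pair lambda ± i*mu:

```python
import numpy as np

lam, mu = 3.0, 2.0                      # illustrative values only
B = np.array([[lam,  mu],
              [-mu, lam]])

print(np.linalg.eigvals(B))             # the pair 3 +/- 2i (in some order)
```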
The columns of V represent the eigenvectors in the sense that A * V = V * D. The matrix V may be badly conditioned, or even singular, so the validity of the equation A = V * D * inverse(V) depends on the condition of V.
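The non-symmetric case can be sketched the same way (assuming NumPy; the 3-by-3 matrix below is only illustrative, and note that numpy.linalg.eig returns a complex diagonal D rather than the real 2-by-2 block form described above). The relation A * V = V * D always holds, while the reconstruction A = V * D * inverse(V) should be trusted only when the condition number of V is modest:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 4.0],
              [5.0, 0.0, 6.0]])         # non-symmetric, illustrative only

w, V = np.linalg.eig(A)                 # D here is complex diagonal, not the real block form
D = np.diag(w)

print(np.allclose(A @ V, V @ D))                 # True: A * V = V * D
print(np.allclose(A, V @ D @ np.linalg.inv(V)))  # reconstruction; reliable only if V is well conditioned
print(np.linalg.cond(V))                         # condition number of V
```

For a matrix that is nearly defective, the condition number of V becomes very large and the reconstruction from V, D, and inverse(V) loses accuracy accordingly.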