/// <summary>
 /// Computes the eigenvalues and eigenvectors of the matrix.
 /// </summary>
 /// <returns>A representation of the eigenvalues and eigenvectors of the matrix.</returns>
 /// <remarks>
 /// <para>For a generic vector v and matrix M, the product Mv points in some direction with no particular relationship to v.
 /// The eigenvectors of a matrix M are the special vectors z that satisfy Mz = &#x3BB;z, i.e. multiplying an eigenvector by the
 /// matrix reproduces the same vector, up to a proportionality constant &#x3BB; called the eigenvalue.</para>
 /// <para>For z to be an eigenvector of M with eigenvalue &#x3BB;, (M - &#x3BB;I)z = 0. But for a matrix to
 /// annihilate a non-zero vector, that matrix must have zero determinant, so det(M - &#x3BB;I) = 0. For a matrix of
 /// order N, this is a polynomial equation of order N in &#x3BB;. Since an order-N polynomial always has exactly
 /// N roots (counting multiplicity), an order-N matrix always has exactly N eigenvalues.</para>
 /// <para>An alternative way of expressing the same relationship is to say that the eigenvalues of a matrix are its
 /// diagonal elements when the matrix is expressed in a basis that diagonalizes it. That is, given Z such that Z<sup>-1</sup>MZ = D,
 /// where D is diagonal, the columns of Z are the eigenvectors of M and the diagonal elements of D are the eigenvalues.</para>
 /// <para>Note that the eigenvectors of a matrix are not entirely unique. Given an eigenvector z, any scaled vector &#x3B1;z
 /// is an eigenvector with the same eigenvalue, so eigenvectors are at most unique up to a rescaling. If an eigenvalue
 /// is degenerate, i.e. there are two or more linearly independent eigenvectors with the same eigenvalue, then any linear
 /// combination of the eigenvectors is also an eigenvector with that eigenvalue, and in fact any set of vectors that span the
 /// same subspace could be taken as the eigenvector set corresponding to that eigenvalue.</para>
 /// <para>The eigenvalues of a symmetric matrix are always real, and its eigenvectors can always be chosen mutually
 /// orthogonal. The transformation matrix Z is thus orthogonal (Z<sup>-1</sup> = Z<sup>T</sup>).</para>
 /// <para>Finding the eigenvalues and eigenvectors of a symmetric matrix is an O(N<sup>3</sup>) operation.</para>
 /// <para>If you require only the eigenvalues, not the eigenvectors, of the matrix, the <see cref="Eigenvalues"/> method
 /// will produce them faster than this method.</para>
 /// </remarks>
 public RealEigensystem Eigensystem()
 {
     double[][] A = SymmetricMatrixAlgorithms.Copy(values, dimension);
     double[]   V = SquareMatrixAlgorithms.CreateUnitMatrix(dimension);
     SymmetricMatrixAlgorithms.JacobiEigensystem(A, V, dimension);
     return new RealEigensystem(dimension, SymmetricMatrixAlgorithms.GetDiagonal(A, dimension), V);
 }
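The Jacobi method invoked above diagonalizes a symmetric matrix by repeatedly applying plane rotations, each chosen to zero one off-diagonal element; the accumulated rotations form the eigenvector matrix. The internals of SymmetricMatrixAlgorithms.JacobiEigensystem are not shown here, so the following is only a minimal pure-Python sketch of the classic algorithm (the function name and tolerances are ours), not the library's implementation:

```python
import math

def jacobi_eigensystem(a, tol=1e-12, max_sweeps=50):
    """Jacobi eigenvalue iteration for a symmetric matrix given as a list of rows.
    Returns (eigenvalues, V), where the columns of V are the eigenvectors."""
    n = len(a)
    a = [row[:] for row in a]  # work on a copy, as the C# code does
    v = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(max_sweeps):
        # stop when the off-diagonal part is negligible
        off = sum(a[i][j] ** 2 for i in range(n) for j in range(n) if i != j)
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p][q]) < 1e-300:
                    continue
                # rotation angle that zeroes a[p][q] in J^T A J
                theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # rows: A <- J^T A
                    apk, aqk = a[p][k], a[q][k]
                    a[p][k] = c * apk - s * aqk
                    a[q][k] = s * apk + c * aqk
                for k in range(n):  # columns: A <- A J
                    akp, akq = a[k][p], a[k][q]
                    a[k][p] = c * akp - s * akq
                    a[k][q] = s * akp + c * akq
                for k in range(n):  # accumulate rotations: V <- V J
                    vkp, vkq = v[k][p], v[k][q]
                    v[k][p] = c * vkp - s * vkq
                    v[k][q] = s * vkp + c * vkq
    return [a[i][i] for i in range(n)], v
```

Each sweep visits all N(N-1)/2 off-diagonal pairs at O(N) cost per rotation, consistent with the O(N<sup>3</sup>) complexity noted in the remarks. For the symmetric matrix [[2, 1], [1, 2]] the sketch returns eigenvalues 1 and 3.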
 /// <summary>
 /// Computes the eigenvalues of the matrix.
 /// </summary>
 /// <returns>An array containing the matrix eigenvalues.</returns>
 /// <remarks>
 /// <para>If you require only the eigenvalues of the matrix, not its eigenvectors, this method will return them faster than
 /// the <see cref="Eigensystem"/> method. If you do need the eigenvectors as well as the eigenvalues, use the <see cref="Eigensystem"/>
 /// method instead.</para>
 /// </remarks>
 public double[] Eigenvalues()
 {
     double[][] A = SymmetricMatrixAlgorithms.Copy(values, dimension);
     SymmetricMatrixAlgorithms.JacobiEigensystem(A, null, dimension);
     return SymmetricMatrixAlgorithms.GetDiagonal(A, dimension);
 }
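As the remarks note, the eigenvalues are the roots of the order-N characteristic polynomial det(M - &#x3BB;I) = 0. For a 2&#xD7;2 symmetric matrix that polynomial is a quadratic solvable in closed form, which makes the claim easy to check by hand; the following is an illustrative sketch (the function is ours, not part of the library):

```python
import math

def symmetric_2x2_eigenvalues(a, b, c):
    """Eigenvalues of [[a, b], [b, c]] as roots of det(M - lam*I) = 0,
    i.e. lam**2 - (a + c)*lam + (a*c - b*b) = 0."""
    mean = 0.5 * (a + c)
    # the discriminant (a - c)**2/4 + b**2 is never negative,
    # so a symmetric matrix always has real eigenvalues
    radius = math.sqrt(0.25 * (a - c) ** 2 + b * b)
    return mean - radius, mean + radius

# example: [[2, 1], [1, 2]] has eigenvalues 1 and 3
```

The non-negative discriminant is the 2&#xD7;2 case of the general fact, stated in the remarks, that the eigenvalues of a symmetric matrix are always real.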