public void EigenvalueDecompositionConstructorTest2()
{
    // Asymmetric test
    double[][] A =
    {
        new double[] {  5, 2, 1 },
        new double[] {  1, 4, 1 },
        new double[] { -1, 2, 3 }
    };

    var target = new JaggedEigenvalueDecomposition(A);
    var D = target.DiagonalMatrix;
    var Q = target.Eigenvectors;

    double[][] expectedD =
    {
        new double[] { 6, 0, 0 },
        new double[] { 0, 4, 0 },
        new double[] { 0, 0, 2 }
    };

    // Decomposition identity: A = Q * D * Q^-1
    var actualA = Matrix.Multiply(Matrix.Multiply(Q, D), Q.Inverse());

    Assert.IsTrue(Matrix.IsEqual(expectedD, D, 1e-5));
    Assert.IsTrue(Matrix.IsEqual(A, actualA, 1e-5));
    Assert.IsTrue(Matrix.IsEqual(A, target.Reverse(), 1e-5));
}
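The identity this test checks, A = Q·D·Q⁻¹, can be verified independently. A NumPy sketch with the same 3×3 asymmetric matrix, using `numpy.linalg.eig` as a stand-in for `JaggedEigenvalueDecomposition`:

```python
import numpy as np

# Same asymmetric 3x3 matrix as in the test above.
A = np.array([[ 5.0, 2.0, 1.0],
              [ 1.0, 4.0, 1.0],
              [-1.0, 2.0, 3.0]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
evals, Q = np.linalg.eig(A)
D = np.diag(evals)

# Decomposition identity: A = Q @ D @ inv(Q).
A_rebuilt = Q @ D @ np.linalg.inv(Q)
```

The eigenvalues come back as 6, 4 and 2 (in some order), matching `expectedD` above.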
public void SolveForEigen(double[][] matrix)
{
    if (matrix.Rows() != matrix.Columns())
        throw new ArgumentException("Matrix must be square.", nameof(matrix));

    var evd = new JaggedEigenvalueDecomposition(matrix,
        assumeSymmetric: true, inPlace: false, sort: true);

    ComponentVectors = evd.Eigenvectors.Transpose();
    Eigenvalues = evd.RealEigenvalues;
}
public void InverseTestNaN()
{
    int n = 5;

    var I = Matrix.Identity(n);

    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            double[][] value = Matrix.JaggedMagic(n);
            value[i][j] = double.NaN;

            var target = new JaggedEigenvalueDecomposition(value);
        }
    }
}
public void EigenvalueDecompositionConstructorTest()
{
    // Symmetric test
    double[][] A =
    {
        new double[] { 4, 2 },
        new double[] { 2, 4 }
    };

    var target = new JaggedEigenvalueDecomposition(A);
    var D = target.DiagonalMatrix;
    var Q = target.Eigenvectors;

    double[][] expectedD =
    {
        new double[] { 2, 0 },
        new double[] { 0, 6 }
    };

    double[][] expectedQ =
    {
        new double[] {  0.7071, 0.7071 },
        new double[] { -0.7071, 0.7071 }
    };

    Assert.IsTrue(Matrix.IsEqual(expectedD, D, 1e-5));
    Assert.IsTrue(Matrix.IsEqual(expectedQ, Q, 1e-4));

    // Decomposition identity: A = Q * D * Q^-1
    var actualA = Matrix.Dot(Matrix.Dot(Q, D), Q.Inverse());

    Assert.IsTrue(Matrix.IsEqual(A, actualA, 1e-4));

    // The property should return the same cached instance on repeated access
    Assert.AreSame(target.DiagonalMatrix, target.DiagonalMatrix);
}
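In the symmetric case the eigenvectors are orthonormal, so the inverse in A = Q·D·Q⁻¹ can be replaced by a transpose. A NumPy sketch with the same 2×2 matrix, using `numpy.linalg.eigh` (which assumes a symmetric input, like the symmetric path of the decomposition above):

```python
import numpy as np

# Same symmetric 2x2 matrix as in the test above.
A = np.array([[4.0, 2.0],
              [2.0, 4.0]])

# eigh is specialized for symmetric matrices; eigenvalues are returned ascending.
evals, Q = np.linalg.eigh(A)
D = np.diag(evals)

# For symmetric A the eigenvector matrix is orthogonal: inv(Q) == Q.T.
A_rebuilt = Q @ D @ Q.T
```

This yields the eigenvalues 2 and 6 expected by the test, and Q·Qᵀ is the identity.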
/// <summary>
///   Learns a model that can map the given inputs to the desired outputs.
/// </summary>
/// <param name="x">The model inputs.</param>
/// <param name="weights">The weight of importance for each input sample.</param>
/// <returns>
///   A model that has learned how to produce suitable outputs
///   given the input data <paramref name="x" />.
/// </returns>
public MultivariateKernelRegression Learn(double[][] x, double[] weights = null)
{
    this.sourceCentered = null;
    this.StandardDeviations = null;
    this.featureMean = null;
    this.featureGrandMean = 0;

    double[][] K;

    if (Method == PrincipalComponentMethod.KernelMatrix)
    {
        K = x;

        if (centerFeatureSpace) // Center the Gram (Kernel) Matrix if requested
            K = Accord.Statistics.Kernels.Kernel.Center(K, out featureMean, out featureGrandMean); // do not overwrite
    }
    else
    {
        this.NumberOfInputs = x.Columns();
        this.Means = x.Mean(dimension: 0);
        this.sourceCentered = Overwrite ? x : Jagged.CreateAs(x);
        x.Subtract(Means, dimension: 0, result: sourceCentered);

        if (Method == PrincipalComponentMethod.Standardize)
        {
            this.StandardDeviations = x.StandardDeviation(Means);
            sourceCentered.Divide(StandardDeviations, dimension: 0, result: sourceCentered);
        }

        // Create the Gram (Kernel) Matrix
        K = kernel.ToJagged(x: sourceCentered);

        if (centerFeatureSpace) // Center the Gram (Kernel) Matrix if requested
            K = Accord.Statistics.Kernels.Kernel.Center(K, out featureMean, out featureGrandMean, result: K); // overwrite
    }

    // Perform the Eigenvalue Decomposition (EVD) of the Kernel matrix
    var evd = new JaggedEigenvalueDecomposition(K, assumeSymmetric: true, sort: true);

    // Gets the Eigenvalues and corresponding Eigenvectors
    int numberOfSamples = x.Length;
    double[] evals = evd.RealEigenvalues;
    double[][] eigs = evd.Eigenvectors;

    int nonzero = evd.Rank;
    if (NumberOfInputs != 0)
        nonzero = Math.Min(nonzero, NumberOfInputs);
    if (NumberOfOutputs != 0)
        nonzero = Math.Min(nonzero, NumberOfOutputs);

    // Eliminate unwanted components
    eigs = eigs.Get(null, 0, nonzero);
    evals = evals.Get(0, nonzero);

    // Normalize eigenvectors
    if (centerFeatureSpace)
        eigs.Divide(evals.Sqrt(), dimension: 0, result: eigs);

    if (Whiten)
        eigs.Divide(evals.Sqrt(), dimension: 0, result: eigs);

    //this.Eigenvalues = evals.Divide(numberOfSamples - 1);
    this.Eigenvalues = evals;
    this.SingularValues = evals.Divide(numberOfSamples - 1).Sqrt();
    this.ComponentVectors = eigs.Transpose();

    if (allowReversion)
    {
        // Project the original data into principal component space
        this.result = Matrix.Dot(K, eigs).ToMatrix();
    }

    // Computes additional information about the analysis and creates the
    // object-oriented structure to hold the principal components found.
    CreateComponents();

    Accord.Diagnostics.Debug.Assert(NumberOfOutputs > 0);

    return CreateRegression();
}
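The feature-space centering done by `Kernel.Center` corresponds to the double-centering formula K' = K − 1ₙK − K1ₙ + 1ₙK1ₙ, where 1ₙ is the n×n matrix with every entry equal to 1/n; the eigenvectors of K' are then divided by √λ so that the projected components have the right scale. A simplified NumPy sketch of those two steps (a linear kernel stands in for `kernel.ToJagged`; this is an illustration of the math, not Accord's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))

# Linear-kernel Gram matrix (the role of kernel.ToJagged above).
K = X @ X.T
n = K.shape[0]

# Double-centering in feature space: K' = K - 1n K - K 1n + 1n K 1n.
ones = np.full((n, n), 1.0 / n)
Kc = K - ones @ K - K @ ones + ones @ K @ ones

# EVD of the centered Gram matrix, sorted by descending eigenvalue.
evals, eigs = np.linalg.eigh(Kc)
order = np.argsort(evals)[::-1]
evals, eigs = evals[order], eigs[:, order]

# Keep only components with nonzero eigenvalue and divide by sqrt(eigenvalue),
# mirroring eigs.Divide(evals.Sqrt(), ...) in the code above.
nonzero = evals > 1e-10
evals, eigs = evals[nonzero], eigs[:, nonzero]
alphas = eigs / np.sqrt(evals)

# Projection of the training data into kernel principal component space.
projection = Kc @ alphas
```

With a linear kernel on 3-dimensional data, exactly three nonzero eigenvalues survive, and the projected columns are uncorrelated with variances proportional to the eigenvalues.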
/// <summary>
///   Learns a model that can map the given inputs to the desired outputs.
/// </summary>
///
/// <param name="x">The model inputs.</param>
/// <param name="weights">The weight of importance for each input sample.</param>
///
/// <returns>
///   A model that has learned how to produce suitable outputs
///   given the input data <paramref name="x" />.
/// </returns>
///
public MultivariateLinearRegression Learn(double[][] x, double[] weights = null)
{
    this.NumberOfInputs = x.Columns();

    if (Method == PrincipalComponentMethod.Center || Method == PrincipalComponentMethod.Standardize)
    {
        this.Means = x.Mean(dimension: 0);

        double[][] matrix = Overwrite ? x : Jagged.CreateAs(x);
        x.Subtract(Means, dimension: 0, result: matrix);

        if (Method == PrincipalComponentMethod.Standardize)
        {
            this.StandardDeviations = x.StandardDeviation(Means);
            matrix.Divide(StandardDeviations, dimension: 0, result: matrix);
        }

        // The principal components of 'Source' are the eigenvectors of Cov(Source).
        // Thus if we calculate the SVD of 'matrix' (which is Source standardized),
        // the columns of matrix V (right side of SVD) will be the principal
        // components of Source.

        // Perform the Singular Value Decomposition (SVD) of the matrix
        var svd = new JaggedSingularValueDecomposition(matrix,
            computeLeftSingularVectors: false,
            computeRightSingularVectors: true,
            autoTranspose: true,
            inPlace: true);

        SingularValues = svd.Diagonal;
        Eigenvalues = SingularValues.Pow(2);
        Eigenvalues.Divide(x.Rows() - 1, result: Eigenvalues);
        ComponentVectors = svd.RightSingularVectors.Transpose();
    }
    else if (Method == PrincipalComponentMethod.CovarianceMatrix || Method == PrincipalComponentMethod.CorrelationMatrix)
    {
        // We only have the covariance matrix. Compute the Eigenvalue decomposition
        var evd = new JaggedEigenvalueDecomposition(x, assumeSymmetric: true, sort: true);

        // Gets the Eigenvalues and corresponding Eigenvectors
        Eigenvalues = evd.RealEigenvalues;
        SingularValues = Eigenvalues.Sqrt();
        ComponentVectors = evd.Eigenvectors.Transpose();
    }
    else
    {
        // The method type should have been validated before we even entered this section
        throw new InvalidOperationException("Invalid method, this should never happen: {0}".Format(Method));
    }

    if (Whiten)
        ComponentVectors.Divide(SingularValues, dimension: 1, result: ComponentVectors);

    // Computes additional information about the analysis and creates the
    // object-oriented structure to hold the principal components found.
    CreateComponents();

    return CreateRegression();
}
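The comment in the SVD branch relies on the fact that the right singular vectors of the centered data matrix are the eigenvectors of its covariance matrix, with λᵢ = σᵢ²/(n−1); this is exactly why the code sets `Eigenvalues = SingularValues.Pow(2)` and divides by `x.Rows() - 1`. A NumPy sketch verifying that equivalence on random data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
Xc = X - X.mean(axis=0)           # center, as the Center/Standardize branch does
n = X.shape[0]

# Route 1: SVD of the centered data (the Center/Standardize branch above).
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenvalues_svd = s ** 2 / (n - 1)

# Route 2: EVD of the covariance matrix (the CovarianceMatrix branch above).
cov = np.cov(Xc, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
eigenvalues_evd = evals[::-1]     # eigh returns ascending; match SVD's descending order
```

The two routes agree on the eigenvalues, and each right singular vector matches the corresponding covariance eigenvector up to sign.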
public MultivariateLinearRegression Learn(double[][] x, double[] weights = null)
{
    this.NumberOfInputs = x.Columns();

    if (Method == PrincipalComponentMethod.Center || Method == PrincipalComponentMethod.Standardize)
    {
        if (weights == null)
        {
            this.Means = x.Mean(dimension: 0);

            double[][] matrix = Overwrite ? x : Jagged.CreateAs(x);
            x.Subtract(Means, dimension: (VectorType)0, result: matrix);

            if (Method == PrincipalComponentMethod.Standardize)
            {
                this.StandardDeviations = x.StandardDeviation(Means);
                matrix.Divide(StandardDeviations, dimension: (VectorType)0, result: matrix);
            }

            var svd = new JaggedSingularValueDecomposition(matrix,
                computeLeftSingularVectors: false,
                computeRightSingularVectors: true,
                autoTranspose: true,
                inPlace: true);

            SingularValues = svd.Diagonal;
            Eigenvalues = SingularValues.Pow(2);
            Eigenvalues.Divide(x.Rows() - 1, result: Eigenvalues);
            ComponentVectors = svd.RightSingularVectors.Transpose();
        }
        else
        {
            this.Means = x.WeightedMean(weights: weights);

            double[][] matrix = Overwrite ? x : Jagged.CreateAs(x);
            x.Subtract(Means, dimension: (VectorType)0, result: matrix);

            if (Method == PrincipalComponentMethod.Standardize)
            {
                this.StandardDeviations = x.WeightedStandardDeviation(weights, Means);
                matrix.Divide(StandardDeviations, dimension: (VectorType)0, result: matrix);
            }

            double[,] cov = x.WeightedCovariance(weights, Means);

            var evd = new EigenvalueDecomposition(cov, assumeSymmetric: true, sort: true);

            Eigenvalues = evd.RealEigenvalues;
            SingularValues = Eigenvalues.Sqrt();
            ComponentVectors = Jagged.Transpose(evd.Eigenvectors);
        }
    }
    else if (Method == PrincipalComponentMethod.CovarianceMatrix || Method == PrincipalComponentMethod.CorrelationMatrix)
    {
        if (weights != null)
            throw new ArgumentException("Sample weights are not supported when learning from a pre-computed covariance or correlation matrix.", nameof(weights));

        var evd = new JaggedEigenvalueDecomposition(x, assumeSymmetric: true, sort: true);

        Eigenvalues = evd.RealEigenvalues;
        SingularValues = Eigenvalues.Sqrt();
        ComponentVectors = evd.Eigenvectors.Transpose();
    }
    else
    {
        throw new InvalidOperationException("Invalid method, this should never happen: {0}".Format(Method));
    }

    if (Whiten)
        ComponentVectors.Divide(SingularValues, dimension: (VectorType)1, result: ComponentVectors);

    CreateComponents();

    return CreateRegression();
}
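The weighted branch substitutes `WeightedMean` and `WeightedCovariance` for the plain statistics; with uniform weights the two paths must produce the same result. A NumPy sketch of that sanity check (using `np.cov` with `aweights`, which with uniform weights reduces to the ordinary sample covariance; this illustrates the math rather than Accord's exact normalization):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 3))
n = X.shape[0]
w = np.ones(n)                     # uniform weights

# Weighted mean (the role of x.WeightedMean above).
mean_w = (w[:, None] * X).sum(axis=0) / w.sum()

# Weighted covariance; with uniform aweights this equals the plain
# sample covariance with the usual n-1 denominator.
cov_w = np.cov(X, rowvar=False, aweights=w)
cov_plain = np.cov(X, rowvar=False)

# Eigen-decomposition of the weighted covariance (the EVD branch above).
evals, evecs = np.linalg.eigh(cov_w)
singular_values = np.sqrt(evals)
```

The weighted mean matches the plain mean, the two covariances agree, and the EVD reconstructs the covariance matrix.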