yat 0.20.3pre

theplu::yat::utility::KernelPCA Class Reference

Principal Component Analysis on a Kernel Matrix.
#include <yat/utility/KernelPCA.h>
Public Member Functions

    KernelPCA (const MatrixBase &kernel)
        Constructor taking the kernel matrix as input.
    virtual ~KernelPCA (void)
        destructor
    const Vector & eigenvalues (void) const
        sorted eigenvalues.
    const Matrix & projection (void) const
        This function will project data onto the new coordinate system.
Detailed Description

Principal Component Analysis on a Kernel Matrix.
This class performs PCA on a kernel matrix. Note that this class does not diagonalize the kernel matrix to find eigen-samples that maximize the variance (use class SVD for that). Instead, it finds eigen-features that maximize the variance in feature space and projects the data onto these eigen-features.
As neither the covariance of the features nor the data itself is available, we create a data matrix Z that fulfills Kernel = Z' * Z. Since Z has the same kernel matrix as the original data, and therefore the same distance between every pair of samples, Z can differ from the original data matrix by at most a rotation and a translation. Hence, the projection of Z onto its first principal components is equivalent to the projection of the original data onto its principal components.
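One way to realize such a Z explicitly is via the eigendecomposition of the kernel (a sketch of the standard construction, not necessarily how the class computes its result):

    Kernel = V * D * V'      (D diagonal with the sorted eigenvalues, V orthogonal)
    Z      = sqrt(D) * V'    so that  Z' * Z = V * D * V' = Kernel

Projecting this Z onto its own principal components again gives sqrt(D) * V': each column holds the coordinates of one sample along the eigen-features, and the eigenvalues in D measure how much variance each eigen-feature captures.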
explicit theplu::yat::utility::KernelPCA::KernelPCA (const MatrixBase &kernel)
Constructor taking the kernel matrix as input. kernel is expected to be symmetric and positive semi-definite.
The kernel matrix contains the scalar products between all pairs of samples, i.e., element kernel(i,j) is the scalar product between sample i and sample j.
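For concreteness, the sketch below builds a linear kernel, for which kernel(i,j) is simply the dot product between data columns i and j. It uses the yat utility Matrix class with rows(), columns() and element access via operator(), which is not part of this page; treat those interface details as assumptions.

    #include <yat/utility/Matrix.h>
    #include <cstddef>

    using theplu::yat::utility::Matrix;

    // Linear kernel K = X' * X for a data matrix X whose columns are samples:
    // element (i,j) is the scalar product between sample i and sample j.
    // Sketch only; the Matrix rows()/columns()/operator() interface is assumed.
    Matrix linear_kernel(const Matrix& X)
    {
      Matrix K(X.columns(), X.columns());
      for (size_t i = 0; i < X.columns(); ++i)
        for (size_t j = 0; j < X.columns(); ++j) {
          double sum = 0.0;
          for (size_t k = 0; k < X.rows(); ++k)
            sum += X(k, i) * X(k, j);
          K(i, j) = sum;   // symmetric and positive semi-definite by construction
        }
      return K;
    }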
const Vector & theplu::yat::utility::KernelPCA::eigenvalues (void) const
sorted eigenvalues.
const Matrix & theplu::yat::utility::KernelPCA::projection (void) const
This function projects the data onto the new coordinate system. Each column of the returned matrix corresponds to a sample.
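Putting the pieces together, a minimal usage sketch (the Matrix and Vector types and their accessors are assumed to be the yat utility classes; only the KernelPCA members documented above are used):

    #include <yat/utility/KernelPCA.h>
    #include <yat/utility/Matrix.h>
    #include <yat/utility/Vector.h>

    using namespace theplu::yat::utility;

    void analyse(const Matrix& kernel)   // symmetric, positive semi-definite
    {
      KernelPCA pca(kernel);
      const Vector& eigenvalues = pca.eigenvalues();  // sorted eigenvalues
      const Matrix& proj = pca.projection();          // column j: sample j in the
                                                      // new coordinate system
      // For example, the coordinates of the first sample (assuming the rows are
      // ordered by eigen-feature) would be proj(0,0), proj(1,0), ...
      double first = proj(0, 0);
      (void)eigenvalues;
      (void)first;
    }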