Registered Member
|
I have a text file containing 907 objects, each with a 1000-dimensional feature vector. Is there any way to compute Principal Component Analysis (PCA) with a library such as Eigen?
|
Moderator
|
PCA is essentially an eigenvalue problem, so assuming your data have been loaded into the matrix mat, you can compute the centered covariance matrix like this:
MatrixXd centered = mat.rowwise() - mat.colwise().mean();
MatrixXd cov = centered.adjoint() * centered;

and then perform the eigendecomposition:

SelfAdjointEigenSolver<MatrixXd> eig(cov);

From eig, you have access to the eigenvalues, sorted in increasing order (eig.eigenvalues()), and the corresponding eigenvectors (eig.eigenvectors()). For instance, eig.eigenvectors().rightCols(N) gives you the best N-dimensional basis. |
Registered Member
|
Is there a method to calculate only the largest K eigenvalues and their corresponding eigenvectors?
The small eigenvalues and their eigenvectors are not useful in my application. |
Moderator
|
If your matrices are large and you only need a few eigenvalues/eigenvectors, you can use the ARPACK wrapper in unsupported/.
|
Registered Member
|
I think you mean to apply the eigenvalue solver to the covariance matrix, not the original matrix:
SelfAdjointEigenSolver<MatrixXd> eig(cov); |
Registered Member
|
Hello,
I'm having the same problem figuring out the correct way to compute PCA, but instead of using an eigendecomposition I want to use the SVD. The input is an MxN matrix X, where M is the number of observations and N is the dimension of each observation, and by definition the data are mean-centered first.
Can anyone please explain the correct way of doing this? |
Moderator
|
This looks correct to me. For a single feature vector f, do: f_projected = W.transpose() * f
|
Registered Member
|
Thanks for the answer.
To conclude:
Please correct me if I'm wrong about the above. Thanks for your time. |
Moderator
|
Recall that A^T * B^T = (B*A)^T, so the last two lines are the same. To be precise, one returns a column vector while the other returns a row vector, but operator= automatically transposes row vectors to column vectors (and the other way round).
|
Registered Member
|
Yup, that's clear now!
Thanks! |
Registered Member
|
Hi, why do you use the adjoint instead of the inverse? |