This forum has been archived. All content is frozen. Please use KDE Discuss instead.

Thin QR decomposition?

tholy
Registered Member
Thin QR decomposition?

Fri Nov 05, 2010 10:40 pm
First, since I'm new to both Eigen and this forum let me begin by saying that I'm very impressed with Eigen. Great performance and a great API are (in my experience) surprisingly difficult to achieve simultaneously, yet you've done it. Thanks for making this terrific tool available.

I have the following problem. I'm interested in "orthonormalizing" a set of N vectors in d dimensions; I have to do this over and over again, for different vectors (and even different N), so I'd like to do it efficiently. I presume that QR decomposition is the right tool. (An alternative is: compute the covariance matrix, compute its Cholesky decomposition, and then calculate the conceptual equivalent of R.inv()*x. But this becomes unpleasant if the covariance matrix is ill-conditioned, which happens often enough to be worrisome, particularly if N is not much bigger than d.)

I wrote the following test program:

#include <iostream>
#include <Eigen/QR>

using namespace std;
using namespace Eigen;

int main()
{
    MatrixXf A = MatrixXf::Random(15,2);
    ColPivHouseholderQR<MatrixXf> qrdecomp(A);
    MatrixXf Q = qrdecomp.householderQ();
    cout << "Q:\n" << Q << endl;
}

This program generates 15 data points in 2 dimensions, and then orthonormalizes them. However, the orthonormalized output Q is a 15-by-15 matrix. For my purposes, I'm only interested in the first two columns (otherwise known as the "thin QR decomposition"), and indeed those columns are the only ones that are unique because of zeros in R. Since I sometimes do this for N = 10000 (or more) data points, it would seem those extra columns would eat up an awful lot of unnecessary storage (and presumably computation time).

Is there a way to get the thin decomposition? I have read the documentation relatively carefully (I think), and also searched, and have so far failed to come up with anything. Hence I am troubling the members of this forum (with apologies) for their guidance.

I am using Eigen 3.0 from the hg repository.

Thanks again!
Andre
Registered Member

Re: Thin QR decomposition?

Sat Nov 06, 2010 1:13 pm
You can use Gram-Schmidt to do exactly that. I don't think it's built in, but it's very easy to write yourself. As far as I remember it's not as accurate as the Householder approach, though.


'And all those exclamation marks, you notice? Five? A sure sign of someone who wears his underpants on his head.' ~Terry Pratchett

'It's funny. All you have to do is say something nobody understands and they'll do practically anything you want them to.' ~J.D. Salinger
tholy
Registered Member

Re: Thin QR decomposition?

Sat Nov 06, 2010 1:47 pm
Thanks for the suggestion. To clarify (sorry, I should have explained better the first time), I often have more "vectors" (data points) than dimensions. This is "orthonormalizing" in the sense of transforming a cloud of data points so that their covariance matrix is equal to the identity, not orthonormalizing in the sense of finding a basis set of orthonormal vectors. And as you say, even for that latter task Gram-Schmidt would not be as accurate as QR.

The Cholesky approach will work as long as the covariance matrix is well-conditioned, which it often is when the number of points greatly exceeds the dimensionality. It's those corner cases I worry about. There are ways to estimate a robust covariance matrix [e.g., Ledoit & Wolf, "A well-conditioned estimator for large-dimensional covariance matrices", J. Multivariate Analysis 88: 365-411 (2004)], and I can use such methods if necessary.
tholy
Registered Member

Re: Thin QR decomposition?

Sat Nov 06, 2010 3:29 pm
Ah, upon further reflection I agree that Gram-Schmidt would work (swap what one considers to be coordinates and point indices, i.e., take the transpose). I still have a bit of reluctance to use it because of its reputation for low accuracy, but I agree that it's worth considering in the absence of a Householder-based alternative.
ggael
Moderator

Re: Thin QR decomposition?

Sat Nov 06, 2010 10:35 pm
I cannot check right now but I think that adding a thin QR is just a matter of adding a thinMatrixQ() function returning a HouseholderSequence object with appropriate size... and you can even do this yourself, outside Eigen, by constructing it from the matrixQR() object. See the implementation of our matrixQ() function...

EDIT: what about the following:

HouseholderQR<MatrixXd> qr(A);
MatrixXd thinQ = householderSequence(qr.matrixQR().leftCols(rank),
                                     qr.hCoeffs().head(rank));

(not tested, copied/pasted from the solve function)
Andre
Registered Member

Re: Thin QR decomposition?

Sun Nov 07, 2010 11:29 am
Well, if performance is most important, modified Gram-Schmidt should be the way to go. It's not as if the results would be terribly inaccurate.

Straight out of Golub, it would look like this:
    // Modified Gram-Schmidt: overwrites A column by column, producing
    // A = Q_1 * R_1 with Q_1 m-by-n (orthonormal columns) and
    // R_1 n-by-n upper triangular.
    MatrixXf A = MatrixXf::Random(10000,20);
    int n = A.cols();
    int m = A.rows();
    MatrixXf R_1 = MatrixXf::Zero(n,n);
    MatrixXf Q_1(m,n);
    for (int i = 0; i < n; ++i) {
        R_1(i,i) = A.col(i).norm();
        Q_1.col(i) = A.col(i) / R_1(i,i);
        for (int j = i+1; j < n; ++j) {
            // Project the remaining columns against the new basis vector.
            R_1(i,j) = Q_1.col(i).transpose() * A.col(j);
            A.col(j) -= Q_1.col(i) * R_1(i,j);
        }
    }


tino
Registered Member

Re: Thin QR decomposition?

Tue Oct 25, 2011 2:46 pm
Hi,

I have just started using Eigen and am very happy with this great library.
However, I ran into the same problem as tholy: I need to calculate the QR
decomposition of a tall, thin matrix, for instance one with 5000 rows and 50 columns.

I tried the thinQ code suggested by ggael:

ggael wrote:
HouseholderQR<MatrixXd> qr(A);
MatrixXd thinQ = householderSequence(
qr.matrixQR().leftCols(rank),
qr.hCoeffs().head(rank));



However, this returns a square Q matrix.
Has there been any effort to include a thin decomposition?

Thanks,
Valentin
ggael
Moderator

Re: Thin QR decomposition?

Tue Oct 25, 2011 9:03 pm
hm, right. The stored QR decomposition is inherently thin; we are only missing an elegant way to assign a sequence of Householder reflectors to a "thin" matrix. The following should do the job:

HouseholderQR<MatrixXd> qr(A);
MatrixXd thinQ;
thinQ.setIdentity(A.rows(), A.cols());
qr.householderQ().applyThisOnTheLeft(thinQ);
tino
Registered Member

Re: Thin QR decomposition?

Wed Oct 26, 2011 6:14 am
Thanks, ggael. That works perfectly.

It might be worth including this function in the library, since calculating the thin decomposition is not an uncommon task.
ggael
Moderator

Re: Thin QR decomposition?

Wed Oct 26, 2011 6:52 am
What's your use case? If all you need is to multiply by Q or its "thin inverse", then you don't need to form Q explicitly; you can directly apply the sequence of Householder reflectors that represents Q.
tino
Registered Member

Re: Thin QR decomposition?

Wed Oct 26, 2011 7:09 am
I do need to calculate Q. I am calculating the Lyapunov exponents and Lyapunov vectors of a large set of ODEs. The vectors are the columns of Q. Perhaps it is not so common to explicitly calculate Q, after all. Anyway, I am happy with the current solution. Thanks.

