Registered Member
Hi all,
So I've been trying to speed up some finite element code by moving from Matlab to C++, and for small systems (i.e. small matrices) things work beautifully. However, once I move to bigger systems (sparse matrices of dimensions around 60,000 x 60,000), Eigen throws `std::bad_alloc` during the `_symbolic()` part of the `SparseLDLT` decomposition. Matlab has no problems (aside from being slow) with identical simulations. In Matlab I solve the systems using `mldivide`, which according to the Matlab docs uses CHOLMOD; I don't imagine that `SparseLDLT` does anything very different from Matlab, so I can't work out what's wrong. Does anyone have any ideas why the simulation is breaking down?

Chris.
Moderator
Note that currently our LDLT does not perform any reordering to reduce fill-in, so the result of the decomposition can be very dense. My guess is that it crashes on the line:

`m_matrix.resizeNonZeros(Lp[size]);`

where `Lp[size]` is the total number of non-zeros needed to represent the factor L. There is currently no built-in sparse decomposition with such reordering, so I recommend using our Cholmod backend, which has the same API; simply declare the solver as:

`SparseLLT<YourMatrixType, Cholmod>`

Of course you need to have Cholmod installed and link against it and its dependencies, and you need to `#define EIGEN_CHOLMOD_SUPPORT` before `#include <Eigen/Sparse>`.
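For completeness, here is a minimal sketch of how those pieces fit together. The `EIGEN_CHOLMOD_SUPPORT` define and the `SparseLLT<..., Cholmod>` declaration are the parts described above; the matrix typedef and the `solveInPlace()` call are only an illustration of the old sparse-module API, so adapt them to whatever your code already uses:

```cpp
// Sketch only: the define and the SparseLLT<..., Cholmod> declaration are
// as described above; the typedef and solveInPlace() call are illustrative
// assumptions and may need adapting to your Eigen version.
#define EIGEN_CHOLMOD_SUPPORT   // must appear before the sparse header
#include <Eigen/Sparse>

using namespace Eigen;

typedef SparseMatrix<double> SpMat;

void solveSystem(const SpMat& A, VectorXd& b)
{
    // Factorize A using the Cholmod backend instead of the built-in LLT/LDLT.
    SparseLLT<SpMat, Cholmod> llt(A);

    // Overwrite b with the solution of A x = b.
    llt.solveInPlace(b);
}
```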
Registered Member
Thanks ggael, I seemed to be getting somewhere using the Cholmod backend (until I managed to introduce a segfault somewhere else in the code). I'll be coming back to this problem when I have some time in the future, but it looks like that's exactly what I needed.