
Memory issues when solving large linear systems

surle (Registered Member, Posts: 7, Karma: 0)
I'm having some memory issues when solving large linear systems. I can set up the structures fine,
but after solve() is called, the Eigen library crashes on me. It only takes a couple
of seconds before it crashes, so I'm guessing it's running out of memory. The code works fine with smaller systems.

Example of my code:
Code: Select all
#include <Eigen/Sparse>
#include <Eigen/CholmodSupport>

// fill A, b
Eigen::SparseMatrix<double> A;
Eigen::VectorXd b;
...

// solve Ax = b
Eigen::CholmodSupernodalLLT< Eigen::SparseMatrix<double> > solver( A );
Eigen::VectorXd x = solver.solve( b );      // crash here


Matrices can be 100,000 x 100,000 or larger.

Note that I specifically have to use a 32-bit program, with its limited address space,
so I can't simply solve my memory issues by going 64-bit or by installing more RAM.

In the Windows Task Manager, I can monitor the memory usage of my program.
It consumes about 1GB of memory before it crashes.

- Is there anything I can do to reduce my memory usage (other than using floats instead of doubles)?
- Can I prevent the Eigen library from trying to allocate more memory than I have?
- Can large systems like this be broken down into smaller systems that use less memory?
- How much more memory does Eigen need once solve() is called?
- Can I estimate how much memory Eigen will use based on my system?

It really doesn't matter if the process takes more time. I just need to know how to solve
a large linear system given a limited amount of memory.

Using the latest Eigen library, 3.2.1, with MS Visual Studio C++ 2010.

Thanks
ggael (Moderator, Posts: 3447, Karma: 19)
Could you at least report the backtrace? If it's crashing inside Cholmod, then there is nothing we can do. However, if this is really a memory issue, then you could fall back to an iterative solver, i.e., ConjugateGradient.
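
For reference, here is a minimal sketch of that fallback, assuming A is symmetric positive definite (which CG requires) and reusing the A and b from the first post:
Code: Select all
#include <Eigen/Sparse>
#include <Eigen/IterativeLinearSolvers>

// Minimal sketch of the iterative fallback. Besides the matrix itself,
// CG only allocates a few work vectors, so its memory overhead is small.
Eigen::ConjugateGradient< Eigen::SparseMatrix<double> > cg;
cg.compute( A );                        // uses the lower triangular part of A by default
Eigen::VectorXd x = cg.solve( b );
if( cg.info() != Eigen::Success )
{
    // not converged: check cg.iterations() and cg.error()
}
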
surle (Registered Member, Posts: 7, Karma: 0)
Thanks for the reply. I have one more question.

I'm assuming all of these libraries, like Cholmod and SuperLU, operate on sparse matrices with the data
in a compressed format and never have to decompress them. So, if I happen to create a large 50k x 50k sparse matrix,
and fill it, should my memory usage increase greatly when attempting to solve it?

Because, while watching my program, I might see the memory usage suddenly jump from 100KB to 1GB after calling solve().
Anything that goes over 1GB is potentially fatal for me.

Is this normal, or could it mean that somewhere my matrices are being decompressed in memory?
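
As a sanity check, here is a rough sketch of estimating what the compressed matrix itself occupies (assuming Eigen's default int storage indices); any usage far beyond this after solve() would belong to the solver's factorization, not to a decompressed copy of A:
Code: Select all
#include <cstddef>
#include <iostream>
#include <Eigen/Sparse>

// Rough size of A in compressed column storage: one double plus one
// inner index per nonzero, and one outer index per column.
std::size_t bytes = A.nonZeros() * ( sizeof(double) + sizeof(int) )
                  + ( A.outerSize() + 1 ) * sizeof(int);
std::cout << "A occupies roughly " << bytes / ( 1024.0 * 1024.0 ) << " MB\n";
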

ggael (Moderator, Posts: 3447, Karma: 19)
Indeed, direct methods as implemented in Cholmod or SuperLU (or Eigen::SimplicialLDLT, Eigen::SparseLU, etc.) might require a lot of memory because the factors are much denser than the input matrix, especially if the input is not very sparse. This is typically the case for problems discretized on a 3D domain, for which the memory footprint easily grows by a factor of 10. On the other hand, the overhead of iterative methods is either negligible or can be bounded. That's why I recommended the ConjugateGradient solver.
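
To illustrate the "can be bounded" part: with an incomplete-factorization preconditioner the fill-in can be capped explicitly. Here is a sketch using BiCGSTAB with IncompleteLUT (one possible pairing; the plain ConjugateGradient above, with its default diagonal preconditioner, is the cheapest option memory-wise):
Code: Select all
#include <Eigen/Sparse>
#include <Eigen/IterativeLinearSolvers>

// Sketch: an iterative solver whose preconditioner memory is bounded.
// setFillfactor(10) roughly limits the kept entries to 10x the
// nonzeros per row of A, so the overhead is capped up front.
Eigen::BiCGSTAB< Eigen::SparseMatrix<double>, Eigen::IncompleteLUT<double> > solver;
solver.preconditioner().setFillfactor( 10 );
solver.setMaxIterations( 10000 );       // time is cheap here, memory is not
solver.compute( A );
Eigen::VectorXd x = solver.solve( b );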

