Registered Member
Hello, I have a program that operates on big matrices (millions of double elements).
The program is written so that no new matrices are created along the way. All the matrices are private members of a specific class; I declare them in the "private:" part of my class definition like this:

    class blah_blah {
    public:
    private:
        Eigen::Matrix<double, 6*N, 1>   chi_vector;
        Eigen::Matrix<double, 6*N, N>   chi_vector_2;
        // ....
        Eigen::Matrix<double, 6*N*N, N> chi_vector_m;
    };

where N is a constant defined at compile time. If I use a big N, I sometimes get std::bad_alloc. What would be the best programming paradigm for allocating my big Eigen matrices once? Should they be members of a class, like above? Do you have any example sources that allocate big matrices without problems?
Moderator
You should not try to allocate big matrices on the stack. I recommend using MatrixXd and VectorXd, and initializing their sizes in your class constructor.
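For example, a minimal sketch along these lines (hypothetical class and member names, loosely modelled on the original post; the sizes are only known at run time, so Eigen puts the data on the heap):

    #include <Eigen/Dense>

    class BigSolver {                   // hypothetical name, for illustration
    public:
        explicit BigSolver(int N)
            : chi_vector(6 * N),        // 6N x 1 vector, heap-allocated by Eigen
              chi_vector_2(6 * N, N)    // 6N x N matrix, heap-allocated by Eigen
        {
            chi_vector.setZero();
            chi_vector_2.setZero();
        }

    private:
        Eigen::VectorXd chi_vector;     // dynamic-size: only a small handle lives inside the object
        Eigen::MatrixXd chi_vector_2;
    };

Because the storage is dynamic, the object itself stays small and nothing large ends up on the stack or in static storage.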
Registered Member
I have also tried to initialize the matrices in the constructor with MatrixXd and resize().
This way the constructor takes too much time (that's fine, since it happens only once), but again I get bad_alloc errors. Is there any other paradigm?
Moderator
If you get bad_alloc, it means you have run out of memory, so you should check your code to see whether you really need all these matrices allocated at once...
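As a rough back-of-the-envelope check (the value of N here is only an assumption for illustration, not taken from your code), a single 6*N*N x N matrix of doubles needs 6*N^3 * 8 bytes:

    #include <cstdio>

    int main() {
        const long long N = 500;                          // hypothetical value, for illustration only
        const long long elems = 6 * N * N * N;            // elements in one 6*N*N x N matrix
        const double gib = elems * sizeof(double)
                         / (1024.0 * 1024.0 * 1024.0);    // bytes -> GiB
        std::printf("one 6*N*N x N matrix of doubles: %.1f GiB\n", gib);
        return 0;
    }

With N = 500 that single matrix already takes roughly 5.6 GiB, so even one such member can exhaust the memory of a typical machine.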