Registered Member
|
Hi,
My objective function is written in Eigen. I haven't tried automatic differentiation in C++ yet (in Matlab it was almost as slow as finite differences). I was wondering what's new on the subject (there seems to be an undocumented related class in the unsupported modules), and whether there are any recommendations (a good AD library that works well with Eigen)? Zohar |
Moderator
|
The one in unsupported/Eigen/AutoDiff works well. Here is a self-contained example combining it with the Levenberg-Marquardt solver:
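In spirit it looks like this (a minimal sketch, not the original attachment: the exponential-fit model, the ExpFitFunctor name, and the data are only illustrative; the AutoDiff/LevenbergMarquardt wiring is the part that matters):

// Sketch: Eigen's unsupported AutoDiff supplies the Jacobian that the
// unsupported LevenbergMarquardt solver expects. The fitted model,
// y = p0 * exp(p1 * x), is an arbitrary stand-in.
#include <iostream>
#include <cmath>
#include <Eigen/Core>
#include <unsupported/Eigen/AutoDiff>
#include <unsupported/Eigen/NonLinearOptimization>

// Residuals templated on the vector type, so the same code runs on
// plain doubles and on AutoDiffScalar.
template<typename VecP, typename VecF>
void residuals(const VecP& p, const Eigen::VectorXd& xs,
               const Eigen::VectorXd& ys, VecF& f)
{
  using std::exp;
  for (int i = 0; i < xs.size(); ++i)
    f(i) = p(0) * exp(p(1) * xs(i)) - ys(i);
}

struct ExpFitFunctor
{
  typedef double Scalar;
  enum { InputsAtCompileTime = Eigen::Dynamic,
         ValuesAtCompileTime = Eigen::Dynamic };
  typedef Eigen::VectorXd InputType;
  typedef Eigen::VectorXd ValueType;
  typedef Eigen::MatrixXd JacobianType;

  Eigen::VectorXd xs, ys;
  int inputs() const { return 2; }
  int values() const { return int(xs.size()); }

  int operator()(const Eigen::VectorXd& p, Eigen::VectorXd& f) const
  {
    residuals(p, xs, ys, f);
    return 0;
  }

  // Jacobian via forward-mode AutoDiffScalar.
  int df(const Eigen::VectorXd& p, Eigen::MatrixXd& J) const
  {
    typedef Eigen::AutoDiffScalar<Eigen::VectorXd> AD;
    typedef Eigen::Matrix<AD, Eigen::Dynamic, 1> ADVec;
    ADVec ap(inputs()), af(values());
    for (int i = 0; i < inputs(); ++i)
      ap(i) = AD(p(i), inputs(), i);          // seed dp_i/dp_i = 1
    residuals(ap, xs, ys, af);
    for (int i = 0; i < values(); ++i)
      J.row(i) = af(i).derivatives().transpose();
    return 0;
  }
};

int main()
{
  const int n = 20;
  Eigen::VectorXd xs(n), ys(n);
  for (int i = 0; i < n; ++i) {
    xs(i) = 0.1 * i;
    ys(i) = 2.0 * std::exp(0.5 * xs(i));      // data generated with p = (2, 0.5)
  }
  ExpFitFunctor fun; fun.xs = xs; fun.ys = ys;
  Eigen::VectorXd p(2); p << 1.0, 0.1;        // initial guess
  Eigen::LevenbergMarquardt<ExpFitFunctor> lm(fun);
  lm.minimize(p);
  std::cout << "estimated parameters: " << p.transpose() << std::endl;
  return 0;
}

The solver only needs a functor exposing operator() for the residuals and df for the Jacobian; df simply re-runs the same templated residual code on AutoDiffScalar.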
|
Registered Member
|
Just to have better context, what exactly is the problem that you are solving?
Does it support Hessians for Newton's method? Do you have a rough estimate of the AD performance vs. a (manual) analytical solution vs. finite differences? I can follow the code, but is there a chance for a simpler cost function, such as f(x) = x*x or f(x) = x'*A*x (a quadratic form for some constant matrix A, where x' is the transpose of x)? |
Moderator
|
Imagine a bar modeled as a polyline with some stretching and bending forces, modeled here as penalties on the differences in segment lengths and angles. One extremity is fixed, and the other one is moved. (This is not a physical simulation: there is no mass, no gravity, no time, etc.)
You nest AD to compute 1st and 2nd order derivatives, and thus get the Hessian. In my experience, auto-diff was always faster than finite differences, but I guess it might depend on the problem. Manual differentiation is significantly faster because you can take advantage of higher-level differentiation rules, remarkable identities, factorizations, etc.
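For the f(x) = x'*A*x you asked about, the nesting looks roughly like this (a minimal sketch; the explicit seeding loop is the fiddly part, and the plain loops keep all scalars of the same AD type):

// Nested AutoDiffScalar: the inner level carries the gradient, the
// outer level differentiates the inner one, giving the Hessian.
#include <iostream>
#include <Eigen/Core>
#include <unsupported/Eigen/AutoDiff>

typedef Eigen::AutoDiffScalar<Eigen::VectorXd> ADInner;          // 1st order
typedef Eigen::Matrix<ADInner, Eigen::Dynamic, 1> ADInnerVec;
typedef Eigen::AutoDiffScalar<ADInnerVec> ADOuter;               // 2nd order

int main()
{
  const int n = 3;
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(n, n);
  Eigen::VectorXd x = Eigen::VectorXd::Random(n);

  // Seed: x_i carries the unit derivative e_i at both levels.
  Eigen::Matrix<ADOuter, Eigen::Dynamic, 1> ax(n);
  for (int i = 0; i < n; ++i) {
    ax(i).value().value() = x(i);
    ax(i).value().derivatives() = Eigen::VectorXd::Unit(n, i);
    ax(i).derivatives().resize(n);
    for (int j = 0; j < n; ++j) {
      ax(i).derivatives()(j).value() = (i == j) ? 1.0 : 0.0;
      ax(i).derivatives()(j).derivatives() = Eigen::VectorXd::Zero(n);
    }
  }

  // f(x) = x' * A * x, written as explicit loops so every operation
  // mixes ADOuter only with plain doubles.
  ADOuter f(0.0);
  for (int i = 0; i < n; ++i)
    for (int j = 0; j < n; ++j)
      f += ax(i) * A(i, j) * ax(j);

  std::cout << "f    = " << f.value().value() << "\n";
  std::cout << "grad = " << f.value().derivatives().transpose() << "\n";
  Eigen::MatrixXd H(n, n);
  for (int i = 0; i < n; ++i)
    H.row(i) = f.derivatives()(i).derivatives().transpose();
  std::cout << "Hessian =\n" << H << std::endl;     // expect H = A + A'
  return 0;
}

The inner level holds the gradient (f.value().derivatives()), and the outer level differentiates it once more, so row i of the Hessian is f.derivatives()(i).derivatives(); for this f you should recover H = A + A'.
|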
Registered Member
|
I forgot to ask: Does it support reverse mode? If not, any recommendations on something that does?
|
Moderator
|
No reverse mode. If I remember correctly, Adol-C (http://www.coin-or.org/projects/ADOL-C.xml) has a reverse mode. There is a support file in unsupported/Eigen/AdolcSupport for the forward mode. I guess it can easily be adapted to work with the tape-based part of the library that allows reverse mode.
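The tape-based API looks roughly like this (a minimal sketch of the standard drivers, nothing Eigen-specific; it records f(x) = sum_i x_i^2 on a tape and gets the gradient in reverse mode):

#include <adolc/adolc.h>
#include <iostream>

int main()
{
  const int n = 4;
  double xin[n] = {1.0, 2.0, 3.0, 4.0};
  double y;

  trace_on(1);                      // start recording tape 1
  adouble ax[n];
  adouble af = 0.0;
  for (int i = 0; i < n; ++i) {
    ax[i] <<= xin[i];               // mark the independents
    af += ax[i] * ax[i];
  }
  af >>= y;                         // mark the dependent
  trace_off();

  double g[n];
  gradient(1, n, xin, g);           // replays the tape: forward pass + reverse sweep
  for (int i = 0; i < n; ++i)
    std::cout << "df/dx_" << i << " = " << g[i] << std::endl;   // expect 2*x_i
  return 0;
}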
|
Registered Member
|
Another question...
Is there support for the sparsity pattern of the Jacobian? |
Moderator
|
I've no idea about Adol-C. Regarding our AutoDiff, I remember it worked with sparse vectors for the derivatives, thus producing a sparse Jacobian.
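Roughly like this (a sketch; the seeding is done by hand because the (value, nbDer, derNumber) constructor builds a dense unit vector):

// Sketch: use a SparseVector as the derivative type, so each scalar
// carries only its nonzero partials.
#include <iostream>
#include <Eigen/SparseCore>
#include <unsupported/Eigen/AutoDiff>

int main()
{
  typedef Eigen::SparseVector<double> SpDer;
  typedef Eigen::AutoDiffScalar<SpDer> ADSp;

  const int n = 1000;               // total number of variables
  // Two variables of a large problem, seeded by hand.
  ADSp xi, xj;
  xi.value() = 1.0; xi.derivatives().resize(n); xi.derivatives().coeffRef(3) = 1.0;
  xj.value() = 2.0; xj.derivatives().resize(n); xj.derivatives().coeffRef(7) = 1.0;

  ADSp t = (xj - xi) * (xj - xi);   // one edge term of an energy
  std::cout << "t = " << t.value()
            << ", nonzero partials: " << t.derivatives().nonZeros() << "\n";
  return 0;
}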
|
Registered Member
|
No, I didn't mean whether it supports sparse matrices; I think that is a given. I meant whether it exploits the problem's sparsity and avoids computation (seeding) of terms w.r.t. some of the variables. Consider, for example, the minimization of the Dirichlet energy on a regular 2D grid (similar to your previous problem, but without the angles):
\sum_{i=1}^{n} \sum_{j \in N(i)} \|v_j - v_i\|^2, where |N(i)| = 4 is the number of neighbors of vertex i. The pseudo (Matlab) code of the objective function is a double loop over the vertices and their neighbors that sums the squared differences; differentiated naively, each of the n terms is seeded against all n variables, which takes O(n^2). In contrast, minimizing [t, Jg] = g(v) with a sparse Jacobian pattern (each of the n terms in t is seeded only for its 4 variables instead of all n), then feeding that into a minimization of [vo, Jh] = h(t), and finally computing grad = Jh * Jg, takes O(n).
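For what it's worth, here is roughly what I mean in Eigen terms (a sketch, using a 1D chain instead of the 2D grid to keep the edge list short; each term is seeded only w.r.t. its own variables via AutoDiffScalar<Vector2d>, and the chain rule grad = Jh * Jg is applied at the end):

// Sketch: exploit the per-term sparsity by seeding each edge term only
// w.r.t. its two endpoints, assemble a sparse Jg, then apply the chain rule.
#include <iostream>
#include <vector>
#include <Eigen/Core>
#include <Eigen/SparseCore>
#include <unsupported/Eigen/AutoDiff>

int main()
{
  typedef Eigen::AutoDiffScalar<Eigen::Vector2d> AD2; // 2 local derivatives

  // a toy 1D chain instead of the 2D grid: edges (i, i+1)
  const int n = 5;
  Eigen::VectorXd v = Eigen::VectorXd::Random(n);
  std::vector<std::pair<int, int> > edges;
  for (int i = 0; i + 1 < n; ++i) edges.push_back(std::make_pair(i, i + 1));

  const int m = int(edges.size());
  Eigen::VectorXd t(m);                       // per-edge energies
  Eigen::SparseMatrix<double> Jg(m, n);       // 2 nonzeros per row
  std::vector<Eigen::Triplet<double> > trips;

  for (int k = 0; k < m; ++k)
  {
    int i = edges[k].first, j = edges[k].second;
    AD2 vi(v(i), 2, 0), vj(v(j), 2, 1);       // seed only the 2 active vars
    AD2 tk = (vj - vi) * (vj - vi);           // edge term ||v_j - v_i||^2
    t(k) = tk.value();
    trips.push_back(Eigen::Triplet<double>(k, i, tk.derivatives()(0)));
    trips.push_back(Eigen::Triplet<double>(k, j, tk.derivatives()(1)));
  }
  Jg.setFromTriplets(trips.begin(), trips.end());

  // h(t) = sum_k t_k, so Jh is a row of ones; chain rule: grad = Jh * Jg
  Eigen::RowVectorXd Jh = Eigen::RowVectorXd::Ones(m);
  Eigen::RowVectorXd grad = Jh * Jg;
  std::cout << "gradient: " << grad << std::endl;
  return 0;
}
|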
Registered Member
|
Here, maybe example 4.1.1 on page 25 would be clearer:
http://www.google.co.il/url?sa=t&rct=j& ... RHWb5HcyDg |