Extending autodiff compatibilities #132

Open
raphaelchinchilla opened this issue Sep 21, 2020 · 4 comments

Comments

@raphaelchinchilla

I was considering extending the autodiff compatibilities of NLSolversBase and submitting the result back as a PR. I wanted to include ReverseDiff.jl, Zygote.jl, and maybe also forward-over-reverse for the Hessian.
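
For concreteness, here is a minimal sketch of the kind of thing I have in mind (standalone code, not the NLSolversBase API; the objective f and the point x0 are just placeholders). As far as I understand, Zygote.hessian is already implemented as forward-over-reverse, i.e. forward mode applied to the reverse-mode gradient:

```julia
using ReverseDiff, Zygote

f(x) = sum(abs2, x) + x[1] * x[2]   # toy objective
x0 = [1.0, 2.0]

# Reverse-mode gradients from the two backends I would like to support
g_rd = ReverseDiff.gradient(f, x0)
g_zy = Zygote.gradient(f, x0)[1]

# Forward-over-reverse Hessian: forward mode over the reverse-mode gradient
H = Zygote.hessian(f, x0)
```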

Is there any reason why this has not been done yet that I should be aware of before starting? I don't want to implement it only to discover that there is a reason why it has not been done.

Also, is there a rationale for the fact that, in twicedifferentiable.jl, when the gradient is provided, the Hessian is calculated using the Jacobian of the gradient in the finite differences case, but using the Hessian of the function in the ForwardDiff case (lines 63 and 69)?
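
To make the question concrete, here is a small standalone sketch of the two code paths I mean (using FiniteDiff as a stand-in for whatever finite-difference backend is actually used, and a hand-written gradient g in place of a user-supplied one):

```julia
using ForwardDiff, FiniteDiff

f(x) = sum(abs2, x) + x[1] * x[2]
g(x) = 2 .* x .+ [x[2], x[1]]   # analytic gradient of f, standing in for a user-supplied gradient
x0 = [1.0, 2.0]

# Finite differences path: Hessian obtained as the Jacobian of the supplied gradient
H_fd = FiniteDiff.finite_difference_jacobian(g, x0)

# ForwardDiff path: Hessian of the objective itself, ignoring the supplied gradient
H_ad = ForwardDiff.hessian(f, x0)

H_fd ≈ H_ad   # both agree up to finite-difference error
```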

@longemen3000
Contributor

The internal implementation of ForwardDiff's hessian is the Jacobian of the gradient too.
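
You can check the equivalence directly (a self-contained example, not NLSolversBase code):

```julia
using ForwardDiff

f(x) = x[1]^2 * x[2] + sin(x[2])
x0 = [0.5, 1.5]

# The Hessian and the Jacobian of the gradient agree, since ForwardDiff
# builds the Hessian by differentiating its own gradient
ForwardDiff.hessian(f, x0) ≈ ForwardDiff.jacobian(x -> ForwardDiff.gradient(f, x), x0)
```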

@longemen3000
Contributor

I advise against integrating new functionality, as Optim is being rewritten (NLSolvers.jl).

@pkofod
Member

pkofod commented Dec 17, 2020

Is there any reason why this has not been done yet that I should be aware of before starting? I don't want to implement it only to discover that there is a reason why it has not been done.

No one has needed it or asked for it. We did support ReverseDiff, but it was broken for a long time, so I took it out.

Also, is there a rationale for the fact that, in twicedifferentiable.jl, when the gradient is provided, the Hessian is calculated using the Jacobian of the gradient in the finite differences case, but using the Hessian of the function in the ForwardDiff case (lines 63 and 69)?

I guess we could use the Jacobian directly if the gradient were available.

I advise against integrating new functionality, as Optim is being rewritten (NLSolvers.jl).

Yes, but if you want to help out, I'm not yet at a point where I'm adding AD to NLSolvers.jl.

@raphaelchinchilla
Author

While considering how to extend the autodiff compatibilities, I came across SciML/GalacticOptim.jl, which already includes all of the backends I was planning to add, so I am no longer sure whether there would be a gain in including them in NLSolvers.jl.

Alternatively, one could try to reuse what was done in GalacticOptim.jl in upstream packages such as NLSolvers.
