
briochemc / f1method.jl


F-1 method

License: MIT License

Topics: julia, optimization, solver, iterative-methods, dual-numbers, hyperdual-numbers, autodifferentiation, sensitivity-analysis, modeling, optimizer

f1method.jl's Introduction


F-1 algorithm

License: MIT

This package implements the F-1 algorithm described in Pasquier and Primeau (in preparation). It allows for efficient quasi-auto-differentiation of an objective function defined implicitly by the solution of a steady-state problem.

Consider a discretized system of nonlinear partial differential equations that takes the form

F(x,p) = 0

where x is a column vector of the model state variables and p is a vector of parameters. The F-1 algorithm then allows for an efficient computation of both the gradient vector and the Hessian matrix of a generic objective function defined by

objective(p) = f(s(p),p)

where s(p) is the steady-state solution of the system, i.e., such that F(s(p),p) = 0, and where f(x,p) is, for example, a measure of the mismatch between the modeled state, the parameters, and observations. Optimizing the model is then simply done by minimizing objective(p). (See Pasquier and Primeau (in preparation) for more details.)
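
For concreteness, here is a hypothetical toy setup (the names xobs, F, and f below are purely illustrative and not part of the package): a two-state nonlinear system and a least-squares mismatch against assumed observations.

# Hypothetical toy problem: 2 state variables, 2 parameters
xobs = [1.0, 2.0]                              # assumed "observations"
F(x, p) = [x[1]^3 - p[1], x[2] - p[2] * x[1]]  # state function, F(s(p), p) = 0
f(x, p) = 0.5 * sum(abs2, x .- xobs)           # mismatch between modeled state and observations

For this toy system the steady state is known in closed form, s(p) = [p₁^(1/3), p₂ p₁^(1/3)], which makes it easy to check the gradient and Hessian returned by the F-1 algorithm.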

Advantages of the F-1 algorithm

The F-1 algorithm is easy to use, gives accurate results, and is computationally fast:

  • Easy — The F-1 algorithm essentially only requires the user to provide a solver (for finding the steady-state), the mismatch function f, an ODEFunction F with its Jacobian, and the gradient of the objective with respect to the state, ∇ₓf. (Note that these derivatives can be computed automatically, via the ForwardDiff package for example — a sketch is given after this list.)
  • Accurate — Thanks to ForwardDiff's nested dual numbers implementation, the accuracy of the gradient and Hessian computed by the F-1 algorithm is close to machine precision.
  • Fast — The F-1 algorithm is as fast as if you had derived analytical formulas for every first and second derivative and used them in the most efficient way. This is because the bottleneck of such computations is the number of matrix factorizations, and the F-1 algorithm requires only a single one. In comparison, standard autodifferentiation methods that treat the steady-state solver as a black box require on the order of m or m² factorizations, where m is the number of parameters.
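
For example, here is a minimal sketch of generating the required derivatives with ForwardDiff, reusing the toy F and f above (the row-vector layout of ∇ₓf is an assumption made for this illustration; check the package tests for the expected shape):

# Derivatives with respect to the state, via ForwardDiff
using ForwardDiff
∇ₓf(x, p) = ForwardDiff.gradient(x -> f(x, p), x)'  # gradient of the mismatch (row vector, assumed layout)
∇ₓF(x, p) = ForwardDiff.jacobian(x -> F(x, p), x)   # Jacobian A = ∇ₓF of the state function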

What's needed?

A requirement of the F-1 algorithm is that the Jacobian matrix A = ∇ₓF can be created, stored, and factorized.

To use the F-1 algorithm, the user must:

  • Make sure that there is a suitable algorithm alg to solve the steady-state equation (a minimal sketch of such a solver is given after this list).
  • Overload the solve function and the SteadyStateProblem constructor from SciMLBase. (An example is given in the CI tests — see, e.g., the test/simple_setup.jl file.)
  • Provide the derivatives of f and F with respect to the state, x.
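
As for the solver, any Newton-type algorithm that factorizes the Jacobian will do. Below is a minimal, self-contained sketch of such a solver; the name newton_solve and its keyword arguments are made up for this illustration, and the actual glue to SciMLBase's solve and SteadyStateProblem is the one shown in test/simple_setup.jl.

# A bare-bones Newton iteration on F(x, p) = 0 (illustrative only)
using LinearAlgebra
function newton_solve(F, ∇ₓF, x0, p; Ftol = 1e-10, maxiter = 50)
    x = copy(x0)
    for _ in 1:maxiter
        Fx = F(x, p)
        norm(Fx) < Ftol && break   # converged
        x -= ∇ₓF(x, p) \ Fx        # Newton step: solve ∇ₓF(x, p) Δx = F(x, p)
    end
    return x
end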

A concrete example

Make sure you have overloaded solve from SciMLBase (an example of how to do this is given in the documentation). Once initial values for the state, x, and parameters, p, are chosen, simply initialize the required memory cache, mem, via

# Initialize the cache for storing reusable objects
mem = initialize_mem(F, ∇ₓf, x, p, alg; options...)

wrap the functions into functions of p only via

# Wrap the objective, gradient, and Hessian functions
objective(p) = F1Method.objective(f, F, mem, p, alg; options...)
gradient(p) = F1Method.gradient(f, F, ∇ₓf, mem, p, alg; options...)
hessian(p) = F1Method.hessian(f, F, ∇ₓf, mem, p, alg; options...)

and compute the objective, gradient, or Hessian via

objective(p)

gradient(p)

hessian(p)

That's it. You were told it was simple, weren't you? Now you can test how fast and accurate it is!
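
If you want to go further and actually optimize the parameters, these wrapped functions can be handed to an optimization package. Here is a hedged sketch using Optim.jl's mutating interface (the vec call assumes gradient(p) returns a row vector, and p stands for the initial parameter values chosen above; adjust to your own setup):

# Feed the wrapped objective, gradient, and Hessian to Optim.jl
using Optim
g!(G, p) = G .= vec(gradient(p))  # in-place gradient, as Optim expects
h!(H, p) = H .= hessian(p)        # in-place Hessian, as Optim expects
results = optimize(objective, g!, h!, p, NewtonTrustRegion())
p_optimized = Optim.minimizer(results)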

Citing the software

If you use this package, or implement your own package based on the F-1 algorithm, please cite us. If you use the F-1 algorithm, please cite Pasquier and Primeau (in prep.). If you also use this package directly, please cite it! (Use the Zenodo link or the CITATION.bib file, which contains a BibTeX entry.)

Future

This package is developed mainly for use with AIBECS.jl and is likely not in its final form. The API was just changed in v0.5 (to match the API changes in AIBECS.jl v0.11). That being said, ultimately, it would make sense for the shortcuts used here to be integrated into a package like ChainRules.jl. For the time being, AIBECS users can use F1Method.jl to speed up their optimizations.

f1method.jl's People

Contributors

briochemc, juliatagbot


f1method.jl's Issues

Make it faster by turning off iterative refinements

Maybe this should be turned off by default for this method? I am unsure how to deal with the non-sparse case, but this is clearly useful in the case where the Jacobian is sparse, and can be done via

using SuiteSparse  # stdlib providing the UMFPACK control vector
SuiteSparse.UMFPACK.umf_ctrl[8] = 0 # turn iterative refinements off

See issue #31105
Maybe add this in the docs/readme?

Change from DiffEqBase to SciMLBase

DiffEqBase is the lowest common denominator for the DiffEq packages, not necessarily the whole SciML ecosystem, and so it has a lot of DiffEq dependencies. These are generally not required by downstream packages. If what you're looking for is a way to define problems without pulling in most of those dependencies, we recommend depending on SciMLBase, since everything like ODEProblem, SteadyStateProblem, etc. is defined there. In short, depend on SciMLBase for problem definitions and on solver packages for specific solvers; most non-SciML packages should not depend on DiffEqBase directly (given the split of SciMLBase from DiffEqBase in 2021).

For more details see: https://diffeq.sciml.ai/stable/features/low_dep/ and https://discourse.julialang.org/t/psa-the-right-dependency-to-reduce-from-differentialequations-jl/72757

Let me know if you need any help updating this, though for almost all dependents here it should be a trivial name change as you're actually using pieces from SciMLBase.
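
In practice the change is often just a rename of the dependency and of the import lines, e.g. (a hypothetical before/after; the identifiers this package actually imports may differ):

# before
using DiffEqBase: SteadyStateProblem, solve
# after
using SciMLBase: SteadyStateProblem, solve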

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!
