
Optim.jl


Univariate and multivariate optimization in Julia.

Optim.jl is part of the JuliaNLSolvers family.

Help and support

For help and support, please post on the Optimization (Mathematical) section of the Julia Discourse or in the #math-optimization channel of the Julia Slack.

Installation

Install Optim.jl using the Julia package manager:

import Pkg
Pkg.add("Optim")

Documentation

The online documentation is available at https://julianlsolvers.github.io/Optim.jl/stable.

Example

To minimize the Rosenbrock function, do:

julia> using Optim

julia> rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
rosenbrock (generic function with 1 method)

julia> result = optimize(rosenbrock, zeros(2), BFGS())
 * Status: success

 * Candidate solution
    Final objective value:     5.471433e-17

 * Found with
    Algorithm:     BFGS

 * Convergence measures
    |x - x'|               = 3.47e-07 ≰ 0.0e+00
    |x - x'|/|x'|          = 3.47e-07 ≰ 0.0e+00
    |f(x) - f(x')|         = 6.59e-14 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 1.20e+03 ≰ 0.0e+00
    |g(x)|                 = 2.33e-09 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    16
    f(x) calls:    53
    ∇f(x) calls:   53

julia> Optim.minimizer(result)
2-element Vector{Float64}:
 0.9999999926033423
 0.9999999852005355

julia> Optim.minimum(result)
5.471432670590216e-17
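
By default, optimize estimates the gradient with finite differences when only the objective is supplied. Below is a minimal sketch of two alternatives described in the Optim.jl documentation: requesting forward-mode automatic differentiation, or passing an analytic gradient that is written in place.

# Forward-mode automatic differentiation (via ForwardDiff) instead of
# the default finite differences:
optimize(rosenbrock, zeros(2), BFGS(); autodiff = :forward)

# Or supply an analytic gradient; g! overwrites G with the gradient at x:
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    G[2] = 200.0 * (x[2] - x[1]^2)
    return G
end

optimize(rosenbrock, g!, zeros(2), BFGS())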

To get information on the keywords used to construct method instances, use the Julia REPL help prompt (?):

help?> LBFGS
search: LBFGS

  LBFGS
  ≡≡≡≡≡

  Constructor
  ===========

  LBFGS(; m::Integer = 10,
          alphaguess = LineSearches.InitialStatic(),
          linesearch = LineSearches.HagerZhang(),
          P = nothing,
          precondprep = (P, x) -> nothing,
          manifold = Flat(),
          scaleinvH0::Bool = true && (typeof(P) <: Nothing))

  LBFGS has two special keywords: the memory length m and the scaleinvH0 flag.
  The memory length determines how many previous Hessian approximations to
  store. When scaleinvH0 == true, the initial guess in the two-loop recursion
  used to approximate the inverse Hessian is the scaled identity, as described
  in Nocedal and Wright (2nd edition, sec. 7.2).

  In addition, LBFGS supports preconditioning via the P and precondprep keywords.

  Description
  ===========

  The LBFGS method implements the limited-memory BFGS algorithm as described in
  Nocedal and Wright (sec. 7.2, 2006) and the original paper by Liu & Nocedal
  (1989). It is a quasi-Newton method that updates an approximation to the
  Hessian using past approximations as well as the gradient.

  References
  ==========

    •  Wright, S. J. and J. Nocedal (2006), Numerical optimization, 2nd edition.
       Springer

    •  Liu, D. C. and Nocedal, J. (1989). "On the Limited Memory BFGS Method
       for Large Scale Optimization". Mathematical Programming B. 45 (3): 503–528
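
As an illustration of the keywords above, here is a sketch of a customized LBFGS instance with a longer memory and a backtracking line search (it assumes LineSearches.jl, which Optim builds on, is available in the active environment):

using Optim, LineSearches

# Memory length of 20 instead of the default 10, and a backtracking
# line search in place of the default Hager-Zhang:
method = LBFGS(m = 20, linesearch = LineSearches.BackTracking())

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
optimize(rosenbrock, zeros(2), method)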

Use with JuMP

You can use Optim.jl with JuMP.jl as follows:

julia> using JuMP, Optim

julia> model = Model(Optim.Optimizer);

julia> set_optimizer_attribute(model, "method", BFGS())

julia> @variable(model, x[1:2]);

julia> @objective(model, Min, (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)
(x[1]² - 2 x[1] + 1) + (100.0 * ((-x[1]² + x[2]) ^ 2.0))

julia> optimize!(model)

julia> objective_value(model)
3.7218241804173566e-21

julia> value.(x)
2-element Vector{Float64}:
 0.9999999999373603
 0.99999999986862
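
Since this is an ordinary JuMP model, standard JuMP queries such as the following can also be used to inspect the solve:

# Both functions are part of JuMP itself:
termination_status(model)   # how the solver reported termination
solution_summary(model)     # compact summary of the result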

Citation

If you use Optim.jl in your work, please cite the following:

@article{mogensen2018optim,
  author  = {Mogensen, Patrick Kofod and Riseth, Asbj{\o}rn Nilsen},
  title   = {Optim: A mathematical optimization package for {Julia}},
  journal = {Journal of Open Source Software},
  year    = {2018},
  volume  = {3},
  number  = {24},
  pages   = {615},
  doi     = {10.21105/joss.00615}
}
