insysbio / likelihoodprofiler.jl

LikelihoodProfiler is a Julia package for practical identifiability analysis and confidence intervals evaluation.

Home Page: https://insysbio.github.io/LikelihoodProfiler.jl/latest

License: MIT License

julia confidence-intervals likelihood systems-biology identifiability identifiability-analysis

likelihoodprofiler.jl's Introduction

LikelihoodProfiler

DOI: 10.1371/journal.pcbi.1008495

LikelihoodProfiler is a Julia language package for identifiability analysis and confidence intervals estimation.

See documentation.

Use cases

Notebooks with use cases can be found in a separate repository: https://github.com/insysbio/likelihoodprofiler-cases

Case Ref
PK model with saturation in elimination Binder
Local optim methods comparison Binder
TGF-β Signaling Pathway Model Binder
SIR Model. A simple model used as an exercise in identifiability analysis. Binder
Cancer Taxol Treatment Model Binder
STAT5 Dimerization Model Binder

Installation

julia> ]

(v1.7) pkg> add LikelihoodProfiler

Quick start

using LikelihoodProfiler

# profile function used for testing
f(x) = 5.0 + (x[1]-3.0)^2 + (x[1]-x[2]-1.0)^2 + 0*x[3]^2

# Calculate the confidence interval for the first parameter component, x[1]
res_1 = get_interval(
  [3., 2., 2.1], # starting point
  1,             # parameter component to analyze
  f,             # profile function
  :LIN_EXTRAPOL; # method
  loss_crit = 9. # critical level of loss function
  )

# Plot parameter profile x[1]
using Plots
plotly()
plot(res_1)

(figure: parameter profile plot)
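Continuing from the quick start, the returned object can also be inspected programmatically. A minimal sketch, assuming the result exposes a `result` collection of endpoints with `direction`, `status`, and `value` fields (field names are assumptions; check the documentation):

```julia
# profile function from the quick start
f(x) = 5.0 + (x[1]-3.0)^2 + (x[1]-x[2]-1.0)^2 + 0*x[3]^2

# Report how each endpoint search terminated (assumed field names)
function report(res)
    for ep in res.result
        println(ep.direction, ": status = ", ep.status, ", value = ", ep.value)
    end
end

# Usage (requires the package): report(get_interval([3., 2., 2.1], 1, f, :LIN_EXTRAPOL; loss_crit = 9.))
```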

License

MIT License

How to cite

Borisov I, Metelkin E (2020) Confidence intervals by constrained optimization—An algorithm and software package for practical identifiability analysis in systems biology. PLoS Comput Biol 16(12): e1008495.

Ref: https://doi.org/10.1371/journal.pcbi.1008495

likelihoodprofiler.jl's People

Contributors: ivborissov, metelkin, pitmonticone

likelihoodprofiler.jl's Issues

Interpreting the `Warning: Close-to-zero parameters found` warning

When I try to compute intervals, I sometimes get these warnings:

 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/method_lin_extrapol.jl:34
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/method_lin_extrapol.jl:34
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/method_quadr_extrapol.jl:35
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/method_quadr_extrapol.jl:35
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/cico_one_pass.jl:32
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.
 └ @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/cico_one_pass.jl:32
 ┌ Warning: Close-to-zero parameters found when using :LN_NELDERMEAD.

Here I compute the same interval using all methods. Is it that the warning happens for each method, and then in both directions?

Exactly what does this warning mean, and how worried should I be about it?
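One mitigation worth trying (an assumption on my part, not an official recommendation): profile near-zero parameters on a log scale, since :LN_NELDERMEAD can behave poorly close to zero. A sketch, assuming the `scale` keyword accepts per-component `:log`/`:direct` symbols:

```julia
# hypothetical loss with a near-zero parameter, for illustration only
loss(x) = (x[1] - 1e-6)^2 / 1e-12 + (x[2] - 1.0)^2

# Sketch: profile x[1] in log space (the per-component `scale` values
# are an assumption; check the documented keyword arguments)
profile_logscale(theta_init) = get_interval(
    theta_init, 1, loss, :LIN_EXTRAPOL;
    loss_crit = 4.0,
    scale = [:log, :direct],
)
```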

Failure to find the correct interval (`:CICO_ONE_PASS` fails where `:LIN_EXTRAPOL` succeeds)

I have a case where I wish to compute the likelihood profile interval. In this case I have done it manually (by brute forcing) just for comparison. Here is the full profile, and the threshold value plotted out:
(figure: full brute-forced profile with the threshold value; the full range is roughly (1e-3, 1e8))

Here, I should get an interval from somewhere around 1e4 to 1e8 (where we hit the boundary).

I first try using the :CICO_ONE_PASS method, but it seems to fail:

conf_int_1 = get_interval(start_p, p_idx, f, :CICO_ONE_PASS; loss_crit, theta_bounds, scan_bounds)

gives:
(figures: `:CICO_ONE_PASS` output)
Here, it claims to have found the left boundary, but it clearly did not. Also, it gives a `ParamInterval` of [8.0, >8.0], which I do not know how to interpret.

Using :LIN_EXTRAPOL it works fine:

conf_int_2 = get_interval(start_p, p_idx, f, :LIN_EXTRAPOL; loss_crit, theta_bounds, scan_bounds)

(figures: `:LIN_EXTRAPOL` output)

Why did the first one fail? Is there a good way to reliably solve this, or at least detect that it failed? I am scanning a large number of intervals for a model, and want to compute how the size of the interval depends on the quality of the data. For a single case I can look at the plot and see what is going on, but for my application I cannot, so I would need to automate this. I have a decent amount of compute, so I could compute the interval using all methods, but I am unsure how to determine that one of them failed.

I could try to provide more detail on the cost function (but it would mean a lot of code; I create the cost function using PEtab, plus several data files).
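For automated scans, one pragmatic approach (a sketch under assumed field names, not a built-in feature) is to run two methods and flag endpoints whose values disagree beyond a tolerance:

```julia
# Pure helper: do two endpoint values agree (treating `nothing` as "not found")?
endpoints_agree(v1, v2; rtol = 1e-2) =
    (v1 === nothing || v2 === nothing) ? v1 === v2 : isapprox(v1, v2; rtol)

# Hypothetical cross-check: run two methods and warn on disagreement
# (field names `result`, `status`, `value` are assumptions; check the docs)
function check_interval(start_p, p_idx, f; kwargs...)
    r1 = get_interval(start_p, p_idx, f, :CICO_ONE_PASS; kwargs...)
    r2 = get_interval(start_p, p_idx, f, :LIN_EXTRAPOL; kwargs...)
    for (ep1, ep2) in zip(r1.result, r2.result)
        endpoints_agree(ep1.value, ep2.value) ||
            @warn "Methods disagree" ep1.status ep2.status ep1.value ep2.value
    end
    return r1, r2
end
```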

Recommended approach for utilising gradients

I have created an optimisation problem using PEtab.jl. This gives me a gradient, which I have tried to adapt to LikelihoodProfiler's format using

function loss_grad(p)
    grad = zeros(9)                        # 9 parameters in this problem
    opt_prob2_3.compute_gradient!(grad, p) # PEtab's in-place gradient
    return grad
end

My impression is that, by default, this gradient is not utilised. What combination of (profiling) method and local algorithm do you recommend for utilising gradients properly?

What kind of advantage can I expect to get if I have a gradient?
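For what it's worth, here is a self-contained sketch of the out-of-place wrapper (the PEtab gradient is replaced by a dummy here; whether `get_interval` accepts a custom `loss_grad` keyword together with a gradient-based `local_alg` such as `:LD_MMA` is an assumption to verify against the documentation):

```julia
# Dummy stand-in for PEtab's in-place gradient, for illustration only
compute_gradient!(grad, p) = (grad .= 2 .* p; grad)

# Out-of-place wrapper in the form a profiler could call
function loss_grad(p)
    grad = zeros(length(p))
    compute_gradient!(grad, p)
    return grad
end

# Then (sketch, assumed keywords):
# get_interval(start_p, p_idx, loss_func, :CICO_ONE_PASS;
#              loss_crit, loss_grad, local_alg = :LD_MMA)
```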

Extend parameter identifiability analysis to nonlinear systems of ODEs

An immediate reaction to the publication of the documentation for LikelihoodProfiler is to consider from the very beginning nonlinear systems of ODEs, as done in Matlab here:
https://bmcsystbiol.biomedcentral.com/articles/10.1186/s12918-017-0428-y

LikelihoodProfiler would be an excellent pure-Julia version of the method in this reference, and the algorithms involved are fully described.

Concerning NLopt.jl, it is a good wrapper for the C NLopt library. Today, a lot of this can be done in pure Julia with Optim or SparseDiffTools, since you are already using JuliaDiffEq.

Find loss value on end point when bound limit is reached

So, if you look at the output of the get_interval function, you can find the exact parameter points (and loss function values) where the interval endpoint was reached (in profilePoints). However, this does not seem to be possible when the edge of the scan boundary was reached.

E.g. for something like this (profile in blue, critical loss value in red):
(figure: profile hitting the scan boundary on the left)
We do not find a value on the left; we are just told that the scan boundary was reached.

Is there still a way to recover the best value of the loss function here (essentially the end point of the blue line)?

My reasoning is that it is not always trivial to know what critical value to use. Here, I would first run one scan (with loss_crit = Inf) to find the maximum values at the boundaries (if these are maximum values; in my case I think so). Next, I could use this (and the loss value at the starting point) to select a good loss_crit for a follow-up scan.
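A sketch of reading the last profiled point when the bound is hit, using named-tuple stand-ins for the assumed `EndPoint`/`ProfilePoint` fields (`status`, `profilePoints`, `value`, `loss`; verify the names against the source):

```julia
# Stand-ins mirroring the assumed structure of a computed endpoint
pp = (value = 1.0e8, loss = 12.3)                         # last profiled point
ep = (status = :SCAN_BOUND_REACHED, profilePoints = [pp])

# If the scan bound was hit, the best loss seen is at the last profile point
if ep.status == :SCAN_BOUND_REACHED && !isempty(ep.profilePoints)
    last_pp = ep.profilePoints[end]
    println("loss ≈ ", last_pp.loss, " at scanned value ", last_pp.value)
end
```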

`Dual{ForwardDiff.Tag{LikelihoodProfiler...` occurs in the `supreme` field of LikelihoodProfiler endpoint structures.

When performing large-scale scans I noticed something odd:
In the EndPoint structure, the supreme field sometimes seems to be a complicated AD structure (an example is provided at the end). Is this something that I should expect?

When this happens, it also always seems to hold:
value is nothing
status is :LOSS_ERROR_STOP

If you are very interested, I could try to provide more information (though it is a bit hard, as I only saw this when checking the results of late scans). For now, two questions:

  1. Is this something that I should expect, or something odd?
  2. Can I interpret this as meaning that LikelihoodProfiler could not finish, and just set the endpoint to the bound?

I should also mention that this happened while doing gradient-based computation of intervals.

Dual{ForwardDiff.Tag{LikelihoodProfiler.var"#loss_func_gd#10"{Float64, Vector{Symbol}, Int64, PEtab.var"#451#452"{SciMLBase.ODEProblem{...}, ...}, ...}, Float64}}(8.0001,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0)

(the full parametric type, several thousand characters of nested PEtab/ModelingToolkit/SciMLBase type parameters, is elided here)

Error when `scan_bounds` and `theta_bounds` are identical

I am trying to perform likelihood profiling for a problem arising from fitting parameters to data. Here, I know bounds for all parameters, and want to incorporate them into both theta_bounds and scan_bounds. It turns out that if scan_bounds is identical to the bounds for that parameter in theta_bounds, you get an error. Wouldn't it be natural to allow this?

In practice this is solvable by making scan_bounds ever so slightly narrower on both sides (and maybe this could be done internally when this turns out to be the case).

Just a thought, that this might be a nice improvement to make.

Minimal example:

using LikelihoodProfiler

# profile function used for testing
g(x) = 5.0 + (x[1]-3.0)^2 + (x[1]-x[2]-1.0)^2 + 0*x[3]^2

# Calculate the confidence interval for the first parameter component, x[1]
res_1 = get_interval(
  [3., 2., 2.1], # starting point
  1,             # parameter component to analyze
  g,             # profile function
  :LIN_EXTRAPOL; # method
  loss_crit = 9., # critical level of loss function
  theta_bounds = [(0.0,10.0), (0.0,10.0), (0.0,10.0)],
  scan_bounds = (0.0,10.0)
  )

gives:

ERROR: ArgumentError: scan_bound are outside of the theta_bounds (0.0, 10.0)
Stacktrace:
 [1] get_endpoint(theta_init::Vector{…}, theta_num::Int64, loss_func::typeof(g), method::Symbol, direction::Symbol; loss_crit::Float64, scale::Vector{…}, theta_bounds::Vector{…}, scan_bound::Float64, scan_tol::Float64, loss_tol::Float64, local_alg::Symbol, silent::Bool, kwargs::@Kwargs{})
   @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/get_endpoint.jl:145
 [2] get_endpoint
   @ ~/.julia/packages/LikelihoodProfiler/Qi97K/src/get_endpoint.jl:112 [inlined]
 [3] #27
   @ ./none:0 [inlined]
 [4] iterate(g::Base.Generator, s::Vararg{Any})
   @ Base ./generator.jl:47 [inlined]
 [5] collect(itr::Base.Generator{UnitRange{…}, LikelihoodProfiler.var"#27#29"{…}})
   @ Base ./array.jl:834
 [6] get_interval(theta_init::Vector{…}, theta_num::Int64, loss_func::typeof(g), method::Symbol; loss_crit::Float64, scale::Vector{…}, theta_bounds::Vector{…}, scan_bounds::Tuple{…}, scan_tol::Float64, loss_tol::Float64, local_alg::Symbol, kwargs::@Kwargs{})
   @ LikelihoodProfiler ~/.julia/packages/LikelihoodProfiler/Qi97K/src/get_interval.jl:119
 [7] top-level scope
   @ ~/Desktop/Julia Playground/Catalyst playground/Envirionment - PEtab_profile_liklihood_test/profile_test_playground.jl:157
Some type information was truncated. Use `show(err)` to see complete types.
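Until this is handled internally, a workaround sketch: shrink `scan_bounds` by a tiny relative margin so it sits strictly inside `theta_bounds`:

```julia
# Shrink a (lb, ub) pair slightly inwards so scan_bounds lies strictly
# inside theta_bounds
function shrink_bounds(bounds; rel = 1e-8)
    lb, ub = bounds
    margin = rel * (ub - lb)
    return (lb + margin, ub - margin)
end

theta_bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 10.0)]
scan_bounds = shrink_bounds(theta_bounds[1])  # strictly inside (0.0, 10.0)
```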

Likelihood profile is not computed all the way to the identifiability level.

I am fitting a set of ODEs to data, minimizing a cost function to do so. After I have found a parameter set I want to use profile likelihood analysis to evaluate practical identifiability. However, I do not seem to get the complete evaluation?

# Getting these functions is quite convoluted, but in principle it is just the fitting of an ODE to data using an L2 distance.
cost_func(p)     # My cost function.
opt_parameterset # The parameter set that minimises the cost function.
opt_val          # The minimal value of the cost function.

using LikelihoodProfiler

@time liklihood_profile = get_interval(
    opt_parameterset,      # starting point
    1,                     # parameter component to analyze
    cost_func,             # profile function
    :LIN_EXTRAPOL;         # method
    loss_crit = 10*opt_val # critical level of loss function
)
plot(liklihood_profile)

(figure: incomplete likelihood profile)

The evaluation of the likelihood profile only takes about 20 seconds as well, so it feels like it could have evaluated more of it.

Is there a way to extend the evaluation?
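Two knobs that may extend the evaluation (hedged: `scan_bounds` is documented; whether an iteration-budget keyword such as `max_iter` is available should be checked against the keyword list):

```julia
# Sketch only: reuses the asker's cost_func / opt_parameterset / opt_val
liklihood_profile = get_interval(
    opt_parameterset, 1, cost_func, :LIN_EXTRAPOL;
    loss_crit = 10*opt_val,
    scan_bounds = (1e-9, 1e9), # widen the scanned range (tune to your problem)
    max_iter = 10^5            # assumed keyword: raise the iteration budget
)
```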

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.
