
Comments (18)

yewalenikhil65 commented on June 7, 2024

Also, if you get time, do take a look here: https://github.com/yewalenikhil65/Application_of_CRNN/tree/main/CRNN_hockinMann
It doesn't work yet. It's highly stiff.

How would you advise checking the training of the species (all 34 of them)? push! pushes all of them into list_plt, which is fine for a small number of species but not for too many.

Update (resolved): I thought it better to mkdir in figs for each species.
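The per-species-directory idea could look like this minimal sketch, assuming Plots.jl is used for the diagnostics; `ode_data` and `pred` (each `ns × nt`) and the species labels are hypothetical placeholders, not names from the repo:

```julia
using Plots

ns = 34                                   # number of species (Hockin-Mann case)
species_names = ["S$i" for i in 1:ns]     # placeholder labels

for (i, name) in enumerate(species_names)
    dir = joinpath("figs", name)
    mkpath(dir)                           # like mkdir, but idempotent and recursive
    plt = plot(ode_data[i, :], label = "data")
    plot!(plt, pred[i, :], label = "CRNN")
    savefig(plt, joinpath(dir, "species_$(name).png"))
end
```

This keeps each species' training history in its own folder instead of accumulating all 34 plots in one list.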

from crnn.

yewalenikhil65 commented on June 7, 2024

I think it's better to open a thread in the discussion section for multithreading and distributed computing in the code.


jiweiqi commented on June 7, 2024

You are right. Although it's not a big deal here, since p and u0 are correctly set in the solver.

By the way, for the PBE problem I don't think you need the Robertson code; I suspect the main difficulty is inferring the nonlinear model parameters, which can be improved by better designing the experiments.

Or alternatively, don't expect the model parameters to be the same as the ground truth, since there is no ground truth...


yewalenikhil65 commented on June 7, 2024

You are right. Although it's not a big deal here, since p and u0 are correctly set in the solver.

By the way, for the PBE problem I don't think you need the Robertson code; I suspect the main difficulty is inferring the nonlinear model parameters, which can be improved by better designing the experiments.

Oh no, I was trying this code again on another stiff system, so I happened to notice this Robertson point again. I shall add this stiff case today to the same PBE repo in a new folder; do take a look whenever you are free. It is currently giving me a headache because of the "ode not solved" issue. I tried increasing the number of reactions to overparametrize the CRNN, but still no luck.

For the PBE, I am currently satisfied with a good fit even though the weights haven't been learnt.


jiweiqi commented on June 7, 2024

Got it! For this Robertson code, you can pass the tspan argument into the solver directly without remaking the problem, and it will be the same.

What exactly is the error?
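As a sketch of the two equivalent options (assuming `prob`, `alg`, `p`, and `tsteps` are already defined as in the Robertson script; passing `tspan` directly to `solve` is as described in this comment):

```julia
using DifferentialEquations

# Option 1: rebuild the problem with a new time span via remake
prob2 = remake(prob; tspan = (0.0, tsteps[end]))
sol = solve(prob2, alg, p = p, saveat = tsteps)

# Option 2: override tspan directly in the solve call, skipping remake
sol = solve(prob, alg, p = p, tspan = (0.0, tsteps[end]), saveat = tsteps)
```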


yewalenikhil65 commented on June 7, 2024

Got it! For this Robertson code, you can pass the tspan argument into the solver directly without remaking the problem, and it will be the same.

What exactly is the error?

It's the retcode returned by the ODE solve: the ODEProblem remade with updated params seems not to be solved, so it prints
println("ode solver failed")
It also arises sometimes in the Robertson code when it's executed, but it's only occasional in Robertson's case.
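A minimal sketch of guarding the loss against a failed solve, so a bad parameter set yields a large but finite loss instead of crashing training. Names (`loss_neuralode`, `ode_data`, `tsteps`) and the penalty value are illustrative; the Symbol-style retcode check matches the DiffEq API of that era, while newer SciML versions compare against `SciMLBase.ReturnCode.Success`:

```julia
function loss_neuralode(p)
    sol = solve(prob, alg, p = p, saveat = tsteps)
    if sol.retcode != :Success        # e.g. :MaxIters or :Unstable on stiff failures
        println("ode solver failed")
        return 1.0e3                  # large finite penalty; the value is a tuning choice
    end
    pred = Array(sol)
    return sum(abs2, ode_data .- pred)
end
```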


jiweiqi commented on June 7, 2024

What's your ODE solver?
It happens for stiff problems. Try a good initialization and clip the gradient initially to avoid bad parameters.
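One possible reading of "clip the gradient" is a simple elementwise clamp before the optimizer update. This is a hypothetical sketch, not the repo's exact code, assuming Zygote for gradients and Flux's ADAM as the optimizer; `grad_max` and `loss_neuralode` are illustrative names:

```julia
using Flux, Zygote

grad_max = 1.0e-1                          # hypothetical clipping threshold
grad = Zygote.gradient(loss_neuralode, p)[1]
grad = clamp.(grad, -grad_max, grad_max)   # keep early updates small
Flux.Optimise.update!(opt, p, grad)        # opt = ADAM(5e-3), for example
```

The clamp matters most in the first epochs, when a large step can push p into a region where the stiff solve fails.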


yewalenikhil65 commented on June 7, 2024

What's your ODE solver?
It happens for stiff problems. Try a good initialization and clip the gradient initially to avoid bad parameters.

AutoTsit5(Rosenbrock23()), the same as the one used for generating the data.
Will try again now with a new initialisation.


jiweiqi commented on June 7, 2024

The solver looks good to me, although you could try a purely stiff solver, like Rosenbrock23(), by removing AutoTsit5.


jiweiqi commented on June 7, 2024

I have just updated the code a little bit at 132c3e8, and you can have a look. One thing to note is that Rosenbrock23(autodiff=false) was used for historical reasons, and you can set autodiff=true now.


yewalenikhil65 commented on June 7, 2024

I have just updated the code a little bit at 132c3e8, and you can have a look. One thing to note is that Rosenbrock23(autodiff=false) was used for historical reasons, and you can set autodiff=true now.

Yes, I will definitely look at this.
Also, if you get time, do take a look here: https://github.com/yewalenikhil65/Application_of_CRNN/tree/main/CRNN_hockinMann
It doesn't work yet. It's highly stiff.


jiweiqi commented on June 7, 2024

The Hockin-Mann problem looks interesting, and I will also try it myself.
At first glance it is indeed stiff, so you might consider using the training strategies from Robertson's problem.
Pay attention to

function p2vec(p)
    # Last entry of p is a learned slope that scales the rate-constant biases
    slope = abs(p[end])
    w_b = @view(p[1:nr]) .* (10 * slope)

    # Input weights (reaction orders): ns species × nr reactions
    w_in = reshape(@view(p[nr * (ns + 1) + 1:nr * (2 * ns + 1)]), ns, nr)

    # Output (stoichiometric) weights, parameterized on a log10 scale
    w_out = reshape(@view(p[nr + 1:nr * (ns + 1)]), ns, nr)
    w_out = @. -w_in * (10 ^ w_out)

    # Keep reaction orders non-negative and bounded
    w_in = clamp.(w_in, 0, 2.5)
    return w_in, w_b, w_out
end
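For reference, the parameter layout implied by p2vec above can be checked with a small sketch; ns = 34 comes from the Hockin-Mann case, while nr and the small random initialization are illustrative choices, not values from the repo:

```julia
ns, nr = 34, 20
np = nr * (2 * ns + 1) + 1     # w_b (nr) + w_out (nr*ns) + w_in (nr*ns) + slope

p = randn(np) .* 1.0e-2        # small random start; other init schemes are possible
w_in, w_b, w_out = p2vec(p)
@assert size(w_in) == (ns, nr)
@assert size(w_out) == (ns, nr)
@assert length(w_b) == nr
```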


jiweiqi commented on June 7, 2024

I feel (10 ^ w_out) is useful for handling the scale separation in the rate constants.


yewalenikhil65 commented on June 7, 2024

The Hockin-Mann problem looks interesting, and I will also try it myself.
At first glance it is indeed stiff, so you might consider using the training strategies from Robertson's problem.
Pay attention to

function p2vec(p)
    slope = abs(p[end])
    w_b = @view(p[1:nr]) .* (10 * slope)

    w_in = reshape(@view(p[nr * (ns + 1) + 1:nr * (2 * ns + 1)]), ns, nr)

    w_out = reshape(@view(p[nr + 1:nr * (ns + 1)]), ns, nr)
    w_out = @. -w_in * (10 ^ w_out)

    w_in = clamp.(w_in, 0, 2.5)
    return w_in, w_b, w_out
end

Yup, I am trying this.


yewalenikhil65 commented on June 7, 2024

pred = clamp.(Array(solve(prob, alg, u0=u0, p=p, sensealg=sense)), -ub, ub)

Here we clamp the whole prediction to a single upper bound. But for multiscale problems I think it makes more sense to have ub as a vector of upper bounds taken from each species' (multiscale) data, and then clamp the prediction of each species individually. What do you think?
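The per-species variant suggested here could look like this sketch, with `ode_data` assumed to be `ns × nt` and the 1.5 safety margin a hypothetical choice. Since pred is `ns × nt` and ub_vec has length ns, broadcasting clamps each species row by its own bound:

```julia
ub_vec = vec(maximum(abs.(ode_data), dims = 2)) .* 1.5   # per-species bounds
pred = Array(solve(prob, alg, u0 = u0, p = p, sensealg = sense))
pred = clamp.(pred, -ub_vec, ub_vec)   # row i is clamped to ±ub_vec[i]
```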


jiweiqi commented on June 7, 2024

Sounds like a good idea. I haven't really played with ub a lot. In general, ub is not so necessary for non-stiff problems.


yewalenikhil65 commented on June 7, 2024

I have just updated the code a little bit at 132c3e8, and you can have a look. One thing to note is that Rosenbrock23(autodiff=false) was used for historical reasons, and you can set autodiff=true now.

Yes, I will definitely look at this.
Also, if you get time, do take a look here: https://github.com/yewalenikhil65/Application_of_CRNN/tree/main/CRNN_hockinMann
It doesn't work yet. It's highly stiff.

I am trying that ub trick with this problem. It is going well, but very slowly, as there are too many (about 1500) parameters.

I thought of parallelizing. Did you mean to use @distributed or @threads for the loop that chooses a random experiment among n_exp? The parameters p are updated for each random choice of experiment in the current serial code, and I am unable to see how they would be updated if we wrote parallel for loops.


jiweiqi commented on June 7, 2024

Yes; normally we cannot do that, as we have to feed the optimizer one batch at a time. I have the impression that some work is trying to do distributed training, but I don't think it is widely accepted yet.
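Where threads can still help is in evaluating the per-experiment losses for monitoring, since those evaluations only read p and do not mutate it. A sketch, assuming a loss function with the hypothetical signature loss_neuralode(p, i_exp):

```julia
loss_epoch = zeros(Float64, n_exp)
Threads.@threads for i_exp in 1:n_exp
    loss_epoch[i_exp] = loss_neuralode(p, i_exp)   # read-only with respect to p
end
```

The optimizer step itself stays serial, one minibatch at a time, as noted above.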

