
Optimally training ensembles (equinox issue, open, 4 comments)

BabaYara commented on June 9, 2024
Optimally training ensembles.


Comments (4)

BabaYara commented on June 9, 2024

I am currently doing something along these lines:

import equinox as eqx
import jax
import jax.numpy as jnp
import optax

def v_pred(model, X):
    # Map the ensemble forward pass over the batch dimension of X.
    return jax.vmap(evaluate_ensemble, (None, 0))(model, X)

def loss_fn30(model, X, Y):
    pred = jnp.squeeze(v_pred(model, X))     # one prediction per example, per member
    loss = pred - jnp.expand_dims(Y, 1)      # broadcast targets across members
    l6 = jnp.square(loss)
    wt0 = jnp.where(l6 < 2e-16, 0.0, 1.0)    # mask errors that are already tiny
    return jnp.mean(wt0 * l6)

LR_rate = 1e-7
init_learning_rate = jnp.array(LR_rate)
opt1 = optax.inject_hyperparams(optax.adamw)(learning_rate=LR_rate, weight_decay=5.0)
clipping = optax.inject_hyperparams(optax.clip_by_global_norm)(max_norm=1e-4)
optimizer1 = optax.chain(clipping, opt1)                              # chain the optimizer with the clipping transform
opt_state1 = optimizer1.init(eqx.filter(mlp_ensemble, eqx.is_array))  # initialise state on the array leaves only


@eqx.filter_jit
def train_per_ensemble(model, x, y, state):
    # Differentiate the loss with respect to the array leaves of the ensemble.
    vals, grads = eqx.filter_value_and_grad(loss_fn30)(model, x, y)
    # Pass only the array leaves as params, matching how opt_state1 was initialised.
    updates, state = optimizer1.update(grads, state, eqx.filter(model, eqx.is_array))
    model = eqx.apply_updates(model, updates)
    return model, state, vals


lockwo commented on June 9, 2024
  1. If you add "python" after the opening "```" you get nice syntax highlighting on the code, like this:

     print("equinox is great")

  2. From a cursory glance your code seems OK, but is there a more specific question here? Is something going wrong with the code? In general, I treat ensembles of models the same way: it's just a pytree, and I handle the per-model differences in the loss function (a minimal sketch of that pattern follows below).
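
For reference, a minimal sketch of the ensemble-as-pytree pattern, along the lines of the ensembling example in the Equinox docs; the layer sizes and ensemble size are arbitrary placeholders, and evaluate_ensemble here is just an illustration, not necessarily identical to the function used in the snippet above:

import equinox as eqx
import jax.numpy as jnp
import jax.random as jr

n_models = 8  # arbitrary ensemble size

# vmap over the keys: the result is one MLP pytree whose array leaves carry a
# leading axis of size n_models (one slice per ensemble member).
@eqx.filter_vmap
def make_ensemble(key):
    return eqx.nn.MLP(in_size=2, out_size=1, width_size=64, depth=2, key=key)

mlp_ensemble = make_ensemble(jr.split(jr.PRNGKey(0), n_models))

# Map over the leading model axis of the parameters while broadcasting x.
@eqx.filter_vmap(in_axes=(eqx.if_array(0), None))
def evaluate_ensemble(ensemble, x):
    return ensemble(x)

out = evaluate_ensemble(mlp_ensemble, jnp.ones(2))  # shape (n_models, 1)

Because the result is an ordinary pytree, everything downstream (eqx.filter_value_and_grad, optax, etc.) treats it like a single model whose leaves happen to have an extra leading axis.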


BabaYara commented on June 9, 2024

The first question is in relation to the loss. I felt like I was forcing the models to converge together with the way my loss is structured, so I would appreciate any tips on structuring the loss so that it treats the models differently.

The second has to do with some single-model training speed-ups that I cannot reproduce here. I was using the scan function to optimize the training loop as below, but I can't seem to apply the same idea to ensembles.

from jax import flatten_util

# Flatten the ensemble's array leaves into one vector so optax sees a single array.
ars, sts = eqx.partition(model, eqx.is_array)
arrs, uf = flatten_util.ravel_pytree(ars)
opt_state1 = optimizer1.init(arrs)

@eqx.filter_jit
def loss_fn30(arrs, X, Y, lvl):
    model = eqx.combine(uf(arrs), sts)   # rebuild the pytree from the flat vector
    pred = v_pred(model, X)
    loss = pred - Y
    bce = jnp.square(loss)
    return jnp.mean(bce)
    
@eqx.filter_jit
def make_fn(X, Y, arrs, lvl):
    grads = jax.grad(loss_fn30)(arrs, X, Y, lvl)
    return grads

@eqx.filter_jit
def make_step02(X, Y, LR, arrs, lvl, state):
    gradss = make_fn(X, Y, arrs, lvl)
    updates, state = optimizer1.update(gradss, state, arrs)  # params (not grads) as the third argument
    arrs = optax.apply_updates(arrs, updates)                # apply the update to the flat parameter vector
    return LR, arrs, lvl, state

@eqx.filter_jit
def step_function(carry, x_y):
    LR, arrs, wght, state = carry
    XX_slice, YY_slice = x_y
    LR, arrs, wght, state = make_step02(XX_slice, YY_slice, LR, arrs, wght, state)
    return (LR, arrs, wght, state), None   

@eqx.filter_jit
def multiple_steps(XX, YY, LR, arrs, wght, state):
    inputs = jnp.expand_dims(XX, axis=1), jnp.expand_dims(YY, axis=1)
    (LR, arrs, wght, state), _ = jax.lax.scan(step_function, (LR, arrs, wght, state), inputs)
    return LR, arrs, wght, state
    
LR, arrs, _, opt_state1 = multiple_steps(XX, yy, LR, arrs, wght, opt_state1)


lockwo commented on June 9, 2024

The loss function is highly dependent on the problem. For some ensembles you can probably just vmap a loss over the model axis; for others you need to inspect each member manually.
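
For the "just vmap a loss over" case, here is a minimal sketch, assuming the ensemble was built with the vmapped-construction pattern above and that Y has a shape matching the per-model predictions; per_model_loss and total_loss are illustrative names rather than anything from Equinox:

import equinox as eqx
import jax
import jax.numpy as jnp

def per_model_loss(ensemble, X, Y):
    # One scalar MSE per ensemble member, so each member's error stays visible separately.
    def single_loss(model, X, Y):
        pred = jax.vmap(model)(X)              # (batch, out); Y assumed to have the same shape
        return jnp.mean(jnp.square(pred - Y))
    return eqx.filter_vmap(
        single_loss, in_axes=(eqx.if_array(0), None, None)
    )(ensemble, X, Y)                          # shape (n_models,)

def total_loss(ensemble, X, Y):
    # Summing rather than averaging over members only rescales the gradients;
    # either way each member's gradient depends only on its own predictions.
    return jnp.sum(per_model_loss(ensemble, X, Y))

Worth noting: with either a pooled mean or a sum, each member's gradient depends only on its own predictions, so in the setup above the places the members actually get coupled are the shared learning rate and the global-norm clipping, since clip_by_global_norm computes one norm over every member's gradients at once.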

Regarding scan: as long as the model isn't among the inputs you loop over (i.e. it sits in the carry rather than in the scanned-over xs), there shouldn't be any issues with it in principle.
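
A minimal sketch of that setup, reusing names from the earlier snippets (mlp_ensemble, optimizer1, and a loss_fn30 taking (model, X, Y) as in the first snippet) and assuming XX and YY are already reshaped to (n_batches, batch_size, ...):

import equinox as eqx
import jax
import optax

params, static = eqx.partition(mlp_ensemble, eqx.is_array)
opt_state = optimizer1.init(params)

def scan_step(carry, batch):
    # One optimizer step on one minibatch; the parameters live in the carry.
    params, opt_state = carry
    x, y = batch
    def loss_on_params(p):
        return loss_fn30(eqx.combine(p, static), x, y)
    loss, grads = jax.value_and_grad(loss_on_params)(params)
    updates, opt_state = optimizer1.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return (params, opt_state), loss

@eqx.filter_jit
def train_epoch(params, opt_state, XX, YY):
    # Only (XX, YY) are scanned over; parameters and optimizer state are the carry.
    (params, opt_state), losses = jax.lax.scan(scan_step, (params, opt_state), (XX, YY))
    return params, opt_state, losses

params, opt_state, losses = train_epoch(params, opt_state, XX, YY)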
