
Comments (11)

sfalkner avatar sfalkner commented on August 14, 2024

Hey,
thanks for trying our package.
The fact that none of your configs was picked by the model suggests that something is wrong. Could you please post the ConfigSpace defining your search space? Otherwise I can't tell you what might have happened.
Best, Stefan

from hpbandster.

gchlebus avatar gchlebus commented on August 14, 2024

Hi Stefan,

Please find below my ConfigSpace definition. Thanks for taking a look at it.

import ConfigSpace as cs

def get_configspace():
  config = cs.ConfigurationSpace()
  config.add_hyperparameter(cs.Constant('training_batchsize', 1))
  config.add_hyperparameter(cs.Constant('subbatch_validation', 1))
  config.add_hyperparameter(cs.Constant('validation_batchsize', 100))
  config.add_hyperparameter(cs.Constant('validation_interval', 400))
  config.add_hyperparameter(cs.CategoricalHyperparameter('loss_function', ['dice', 'categorical_cross_entropy',
                                                                           'dice_allchannels', 'top-k(5)']))
  config.add_hyperparameter(cs.UniformFloatHyperparameter('learning_rate', lower=1e-6, upper=1e-3, log=True))
  config.add_hyperparameter(cs.UniformFloatHyperparameter('arch_spatial_dropout', lower=0, upper=0.8,
                                                          default_value=0.5, log=False))
  config.add_hyperparameter(cs.CategoricalHyperparameter('arch_activation', ['relu', 'leaky_relu']))
  return config

sfalkner avatar sfalkner commented on August 14, 2024

Oh I see,
Could you try removing the constants from your ConfigSpace?
I know ConfigSpace has those, but I have never tested them, so they probably don't work with BOHB. The reason your run still completes is that BOHB falls back to plain Hyperband if the configuration space contains unsupported features. You would see some messages about this in the debug output, but I guess I should turn them into warnings that are always printed.
Sorry about that. Let me know if that fixes your problem.
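One way to keep such fixed settings out of the search space entirely (a sketch of my own, not an hpbandster API) is to store them in a plain dict on the worker side and merge them with each sampled configuration:

```python
# Sketch (not part of hpbandster): keep would-be Constants outside the
# ConfigSpace and merge them back in before training the model.
FIXED_SETTINGS = {
    'training_batchsize': 1,
    'subbatch_validation': 1,
    'validation_batchsize': 100,
    'validation_interval': 400,
}

def full_config(sampled_config):
    """Combine a configuration sampled by BOHB with the fixed settings."""
    merged = dict(FIXED_SETTINGS)
    merged.update(sampled_config)  # sampled values win on key clashes
    return merged
```

This keeps the ConfigSpace restricted to the hyperparameters that are actually tuned, so BOHB's model never has to handle constants.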

gchlebus avatar gchlebus commented on August 14, 2024

Great, thanks for the hint! I just started a second run without any Constants in my configspace. I will let you know whether it helped.

Could you tell me what the n_iterations parameter passed to the BOHB.run method means? I was unable to derive its meaning from the source code.

gchlebus avatar gchlebus commented on August 14, 2024

Hi Stefan,
unfortunately, removing the Constants from the ConfigSpace didn't resolve the problem. Please find attached the ConfigSpace definition, the sampled configs, and the results written by the result logger (attached as txt, since GitHub doesn't support json files). I started the optimization with min_budget=2 and max_budget=10.
Do you have any idea why none of my configs has been sampled from the model yet?

results.txt
config.txt
configspace.txt

sfalkner avatar sfalkner commented on August 14, 2024

Hey,

Oh I see what's going on. So here is what's happening:

  • you only have two budgets: 3.33 and 10 (a result of your min and max budget and the eta=3 default)
  • as a consequence, there are two types of iterations: one that tries 3 configs on the small budget and advances one of them to the large budget, and another that simply tries two configs on the largest budget.
  • given your config space dimensionality (4 parameters), BOHB only starts building a model after 6 evaluations on any one budget.
  • you do 4 iterations, which results in exactly 6 evaluations on each of the two budgets (your scenario with only two budgets might not give you much speedup, as you don't evaluate many more configurations on the smaller budget)
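For reference, the two budgets (3.33 and 10) fall out of the geometric schedule Hyperband uses. A minimal sketch of that computation (mirroring what hpbandster does internally; names are mine):

```python
import math

def hyperband_budgets(min_budget, max_budget, eta=3):
    """Geometric budget schedule: max_budget / eta**k, down to >= min_budget."""
    # number of rungs between min and max budget, spaced by factors of eta
    max_sh_iter = -int(math.log(min_budget / max_budget) / math.log(eta)) + 1
    return [max_budget * eta ** -i for i in reversed(range(max_sh_iter))]
```

With min_budget=2, max_budget=10, and eta=3 this yields two budgets, roughly 3.33 and 10, exactly as described above.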

So the very next iteration should contain configurations sampled from the model. You can also reuse old runs by feeding them into the model before you start BOHB again; see this example.
Let me know if you have any trouble with that.

Best,
Stefan

gchlebus avatar gchlebus commented on August 14, 2024

Hi Stefan,

Many thanks for your help. I just started the next optimization run and will let you know once it ends. Could you elaborate a bit more on the types of iterations? What types of iterations are there, and how is the iteration type chosen? I would appreciate a short explanation or a pointer to the code.

sfalkner avatar sfalkner commented on August 14, 2024

So Hyperband and BOHB do a round robin over different iteration types that trade off an aggressive minimal budget with many configurations against a conservative budget with only a few configurations.
I like the explanation in the original Hyperband post, although their parametrization is slightly different.
The basic idea is that the first iteration starts with the minimal budget and increases it by factors of eta (=3), cutting down the number of configurations by the same factor. The next iteration raises its smallest budget, which means fewer configurations are evaluated, but with a higher fidelity right away.
This goes on until the iteration that only evaluates configurations on the largest budget. This is implemented in the code here. You only need to understand what self.budgets and the ns variable in get_next_iteration are: those tell you how many configurations (ns) are evaluated at which budget.
BTW, I updated the FAQ to include your questions from here.
Let me know if you have any more questions, or close the issue if you are happy :)
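That budget/ns bookkeeping can be sketched as follows. This is my own reconstruction (not hpbandster's exact code), using the min_budget=2, max_budget=10, eta=3 values from this thread:

```python
import math

def iteration_plan(iteration, min_budget=2, max_budget=10, eta=3):
    """Reconstruct how many configs (ns) run on which budgets for one
    Hyperband/BOHB iteration. Sketch only, not hpbandster's exact code."""
    # geometric budget schedule between min and max budget
    max_sh_iter = -int(math.log(min_budget / max_budget) / math.log(eta)) + 1
    budgets = [max_budget * eta ** -i for i in reversed(range(max_sh_iter))]
    # iteration types cycle round robin, from aggressive to conservative
    s = max_sh_iter - 1 - (iteration % max_sh_iter)
    n0 = int(math.floor(max_sh_iter / (s + 1)) * eta ** s)
    ns = [max(int(n0 * eta ** -i), 1) for i in range(s + 1)]
    return ns, budgets[-(s + 1):]
```

With the settings above, iteration 0 runs 3 configs on budget 3.33 and advances 1 to budget 10, while iteration 1 runs 2 configs directly on budget 10, matching the two iteration types described earlier in this thread.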

gchlebus avatar gchlebus commented on August 14, 2024

As you expected, the next iteration picked some configurations from the model. Thank you very much for your help and the explanation of the BOHB internals. Closing the issue.

gui-miotto avatar gui-miotto commented on August 14, 2024

So, does this mean that BOHB fits a different model for each budget?
In other words, I'll only start evaluating (1 - random_fraction) of the configurations from model samples once min_points_in_model configurations have been evaluated on a given budget. Is that it?
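For what it's worth, that matches my understanding of the BOHB paper: one model is kept per budget, a random_fraction of configurations is always drawn at random, and the rest come from the model trained on the largest budget that already has enough observations. A toy sketch of that selection logic (names are mine, not hpbandster's API):

```python
import random

def choose_sampler(points_per_budget, min_points_in_model,
                   random_fraction=1 / 3, rng=random.random):
    """Decide how to draw the next configuration, BOHB-style (toy sketch).

    points_per_budget maps each budget to the number of finished
    evaluations on it.
    """
    if rng() < random_fraction:
        return 'random'
    # prefer the model on the largest budget with enough observations
    for budget in sorted(points_per_budget, reverse=True):
        if points_per_budget[budget] >= min_points_in_model:
            return ('model', budget)
    return 'random'  # no budget has a usable model yet
```

So early on every sample is random; once one budget accumulates min_points_in_model observations, most samples come from that budget's model.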

gui-miotto avatar gui-miotto commented on August 14, 2024

Just for the record, config_space Constants work fine with BOHB. I've tested this.
