
Comments (9)

lena-kashtelyan commented on April 18, 2024

@yjhong89, hi again!

I am still somewhat confused about your setup –– please correct me if my understanding below is wrong:

  1. You already have a set of 200 evaluation results for some parameter configurations in your search space,
  2. You are not looking to evaluate more configurations, but rather to find the best one among your results,
  3. Ideally, you want all trials suggested via BayesOpt to fall into the pool of 200 configurations you have already evaluated?

If this is the setup, then what is the expected use of BayesOpt here?


Re: TPE, we do not use it, the surrogate model is always Gaussian process.
Re: BayesOpt on discrete spaces –– the parameter values being ordered will be helpful for the algorithm's performance. Even in the unordered case, BayesOpt should still work mechanistically since we are doing a one-hot encoding. However, it would probably not work particularly well in either case; a tree-based or quasi-random approach would probably work better.

from ax.

kkashin commented on April 18, 2024

Hey @yjhong89 - great question.

We do support certain forms of constraints on the search space, but they do not generally allow for the kind of flexibility that I think you want in this case, if I understand you correctly. You're really asking for an ad hoc, non-programmatically-generated set of initial arms for your initial exploration round, on top of which you will then run the bandit algorithm.

I've attached an example notebook that adapts the bandit tutorial to demonstrate how one can manually select the initial arms instead of using a full-factorial design (which will take combinations of all the arms in the search space) for the initial exploration trial / round, and then run the bandit algorithm.

See the .ipynb and HTML versions of the notebook here:
bandit_adhoc.zip.

Can you let me know if this addresses your question?

Longer term, we're working on improving our APIs to better support ad hoc search spaces.


yjhong89 commented on April 18, 2024

@kkashin - Thank you for your answer.

I will try your solution and let you know.

Thanks.

--Yet another question--
If I have the same problem (an ad hoc search space configuration) and want to do Bayesian optimization, how can I do that?

I have been looking at the Bayesian optimization tutorial, but Bayesian optimization does not seem to have the 'arm' concept that bandit optimization has.

I tried it this way (adapted from https://github.com/facebook/Ax/blob/master/tutorials/gpei_hartmann_service.ipynb):

    ax = AxClient()
    ax.create_experiment(....)

    for i in range(20):
        params, trial_index = ax.get_next_trial()
        # Reject suggestions outside the configurations I have results for
        # (assuming my_search_space is a set of parameter-value tuples).
        if tuple(params.values()) not in my_search_space:
            ax.log_trial_failure(trial_index=trial_index)
            continue
        ax.complete_trial(...)

By using the ax.log_trial_failure API, I can skip evaluations for parameter configurations that do not have real function values.
However, I think there may be a better way to do this.
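One sketch of such an alternative: instead of failing trials, store the 200 existing results in a plain dict keyed by value tuples and look each suggested configuration up in the cache. The parameter names `x0`..`x7` and the numbers below are hypothetical, and this is pure Python rather than Ax code:

```python
# Hypothetical cache of already-evaluated configurations:
# key = tuple of the 8 parameter values, value = measured objective.
evaluated = {
    (0, 1, 2, 0, 1, 2, 0, 1): 0.73,
    (2, 2, 0, 1, 0, 0, 1, 2): 0.41,
}

def lookup(params):
    """Return the cached objective for a configuration, or None if the
    configuration was never evaluated (a 'hole' in the search space)."""
    return evaluated.get(tuple(params[f"x{i}"] for i in range(8)))

config = {f"x{i}": v for i, v in enumerate((0, 1, 2, 0, 1, 2, 0, 1))}
print(lookup(config))  # -> 0.73
```

A suggested trial whose lookup returns None would then be skipped (or failed) exactly as in the loop above, but cached hits can be completed immediately without re-running the slow evaluation.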

Thank you.


2timesjay commented on April 18, 2024

@yjhong89 if the invalid areas of your search space can be represented as linear constraints (i.e., the valid area is a convex linear polytope), then you can use ParameterConstraints. These are accepted as arguments to the SearchSpace.
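To make the polytope idea concrete, a linear constraint of the kind ParameterConstraint expresses (a weighted sum of parameters bounded above) can be checked point-wise with a pure-Python sketch like this (not Ax code; names are illustrative):

```python
def satisfies(point, constraints):
    """Check a point against linear constraints of the form
    sum(coef[name] * point[name]) <= bound."""
    return all(
        sum(coef * point[name] for name, coef in coefs.items()) <= bound
        for coefs, bound in constraints
    )

# Example: x0 + x1 <= 3 describes a half-space; intersecting several
# such half-spaces yields a convex polytope of valid points.
cons = [({"x0": 1.0, "x1": 1.0}, 3.0)]
print(satisfies({"x0": 1, "x1": 1}, cons))  # -> True
print(satisfies({"x0": 2, "x1": 2}, cons))  # -> False
```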

If your invalid areas can't be represented this way (if they're "holes" in the search space for instance) then your solution is pretty similar to any other options I could suggest. Creating your own RandomModel would be a reasonable but heavyweight option that would let you implement custom rejection sampling.
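A rejection-sampling loop of the kind such a custom RandomModel would implement can be sketched in pure Python; the validity predicate below is a stand-in for the real, domain-specific notion of which configurations are "holes":

```python
import random

def rejection_sample(is_valid, n, seed=0):
    """Uniformly sample n valid points from the 3**8 grid by proposing
    uniform random points and discarding any that fail the predicate."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        point = tuple(rng.randrange(3) for _ in range(8))
        if is_valid(point):
            samples.append(point)
    return samples

# Stand-in predicate: treat points whose values sum to an odd number
# as invalid "holes" in the search space.
pts = rejection_sample(lambda p: sum(p) % 2 == 0, 5)
print(all(sum(p) % 2 == 0 for p in pts))  # -> True
```

The "heavyweight" part is not this loop but wiring it into Ax's model machinery so the generated candidates flow through the usual trial lifecycle.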


lena-kashtelyan commented on April 18, 2024

@yjhong89, you can also do Bayesian optimization on search spaces that consist of ChoiceParameter-s. However, this may not be the most reasonable solution, and you would still have to mark some trials as failed right away if the invalid combinations cannot be described as constraints and you don't want to simply exclude those parameter values from the search space.

To find which solution is best, may I ask a bit more about your use case? How many total valid combinations are in your search space? Are all of your parameters choice parameters, and how many are there? What percentage of the total number of combinations do you know to be invalid?


yjhong89 commented on April 18, 2024

@2timesjay
Thanks for your suggestion.
I tried to constrain my search space, but I don't think it is appropriate in my case, since the empty combinations in my search space cannot be represented as a convex region.

@lena-kashtelyan
I already tried ChoiceParameter, but this case is similar to using an integer RangeParameter in that I would still have to manually reject some inputs.
My case consists of 8 input parameters, each of which has 3 possible values (0, 1, 2), so there are 3**8 = 6561 cases in total.
But I have results for only 200 of them, since the function evaluation for each input is very slow.
I tried to use Ax to apply Bayesian optimization to the cases I already have.
By the way, does Bayesian optimization with a Gaussian process surrogate model work well with discrete inputs? I think it becomes tricky due to the kernel representation.
Or does Ax support the tree-structured Parzen estimator surrogate model used in hyperopt?
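As a sanity check on the grid size mentioned above, the full set of combinations can be enumerated with `itertools.product`:

```python
from itertools import product

# Full grid of 8 parameters, each taking one of the values 0, 1, 2.
grid = list(product(range(3), repeat=8))
print(len(grid))  # -> 6561, i.e. 3**8
```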

Thank you.


yjhong89 commented on April 18, 2024

@lena-kashtelyan, Thanks for your answer.

Your understanding is correct.
But the expected use of BayesOpt is not important to me.
I just wanted to check, technically, how Ax works on a discrete search space with holes,
since in my domain there are situations where some parameter combinations in the search space are not necessary to evaluate.

Thanks.


lena-kashtelyan commented on April 18, 2024

@yjhong89, would you say I fully answered your question or is there anything else you would like me to clarify?


yjhong89 commented on April 18, 2024

@lena-kashtelyan
I am satisfied with your answers; this issue is good to close.

