
Comments (4)

sdaulton avatar sdaulton commented on July 28, 2024

Hi @Abrikosoff,

I am guessing that @Balandat intended the constraint model to be pretrained outside of Ax. If you want to use non-linear constraints with scipy, you could implement a custom Acquisition with a different optimize function that constructs the right non-linear constraint from the fitted model.
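A minimal sketch of that first route, under stated assumptions: `constraint_mean` below is a plain-function stand-in for the posterior mean of a pretrained constraint model (it is not a real Ax/BoTorch object), and the toy objective stands in for the acquisition value. The point is only that a fitted surrogate can be wrapped in a `scipy.optimize.NonlinearConstraint` that the optimizer then respects.

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

def constraint_mean(x):
    # Stand-in for the posterior mean of a pretrained constraint model
    # (hypothetical); the point x is feasible when this is >= 0.
    return 1.0 - np.sum(x ** 2)

# Feasible iff g(x) >= 0, expressed as lower/upper bounds on g
nlc = NonlinearConstraint(constraint_mean, 0.0, np.inf)

# Toy objective standing in for (negated) acquisition value
res = minimize(
    lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2,
    x0=np.array([0.1, 0.1]),
    constraints=[nlc],
    method="trust-constr",
)
```

Here the optimum of the unconstrained toy objective lies outside the feasible region, so the solver lands on the constraint boundary.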

An alternative would be to create a new acquisition function that constructs and uses the probabilistic constraint, e.g. EI weighted by the probability that the probabilistic constraint is satisfied. One way to do this would be to make a subclass of (Log)EI that creates the necessary constraint within its input constructor, similar to this.

You could then use this acquisition function in a GenerationStrategy. Parts 3b and 5 of this tutorial show how to do this.
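For intuition, the constraint-weighted acquisition idea can be sketched outside Ax entirely. The names below are illustrative stand-ins, not Ax/BoTorch API: `mu_f`/`sigma_f` and `mu_c`/`sigma_c` play the role of posterior means and standard deviations of the objective and constraint models, and the weighting is analytic EI times the probability that the Gaussian constraint posterior is nonnegative.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu_f, sigma_f, best_f):
    # Standard analytic EI for maximization
    z = (mu_f - best_f) / sigma_f
    return sigma_f * (z * norm.cdf(z) + norm.pdf(z))

def constrained_ei(mu_f, sigma_f, best_f, mu_c, sigma_c):
    # Weight EI by P(c(x) >= 0) under the constraint model's posterior
    prob_feasible = norm.cdf(mu_c / sigma_c)
    return expected_improvement(mu_f, sigma_f, best_f) * prob_feasible
```

When the constraint posterior is confidently feasible the weight approaches 1 and the acquisition reduces to plain EI; when it is confidently infeasible the acquisition is driven to zero.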


Abrikosoff avatar Abrikosoff commented on July 28, 2024


Hi Sam, thanks a lot for the reply! What I'm currently doing is defining nonlinear constraints and passing them to a GenerationStrategy, something like the following:

local_nchoosek_strategy = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=num_sobol_trials_for_nchoosek,  # https://github.com/facebook/Ax/issues/922
            min_trials_observed=min_trials_observed,
            max_parallelism=max_parallelism,
            model_kwargs=model_kwargs,
        ),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        "nonlinear_inequality_constraints": [_ineq_constraint],
                        "batch_initial_conditions": batch_initial_conditions,
                    }
                }
            },
        ),
    ]
)

which I can then pass to my AxClient. My initial idea was to pass estimate_probabilities_of_satisfaction along with _ineq_constraint, which would let me do this relatively simply in the Service API. I guess what you mean is that there is no good way to do this if the trained model is required as one of the inputs (as in this case)?


sdaulton avatar sdaulton commented on July 28, 2024

Yes, that's right. If you need a model trained in Ax on data collected during the experiment, I would recommend one of the two approaches I mentioned, since you would then have access to the trained model.


Abrikosoff avatar Abrikosoff commented on July 28, 2024


Hi Sam @sdaulton, once again thanks for your reply! I'm preparing to try your alternative suggestion (subclassing LogEI), and I have a few related questions:

  1. The complete procedure, I think, is to define an input constructor for a subclass of qLogEI, i.e. a function akin to construct_inputs_qLogEISpecialConstraints (for want of a better name) with the necessary inputs, and then pass that subclass via the botorch_acqf_class keyword in model_kwargs of the GenerationStep of a GenerationStrategy? Is that more or less correct?

  2. If the above is correct, and looking at your linked code snippet, I see there is a kwarg entry called constraints; should I pass my nonlinear constraints here? I am confused because, from the docstring, it seems the constraints here assume g(x) < 0, which is a bit different from the usual nonlinear_inequality_constraints kwarg (which assumes g(x) > 0) that one passes to model_gen_kwargs. In addition, if I pass my nonlinear constraints here, do I need to pass them again to model_gen_kwargs in the GenerationStep?

  3. And if both of the above are clarified, is the model from the model keyword in the input constructor the model I can use in the probability constraint?
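On the sign-convention point in question 2, the flip can be sketched directly. The two toy functions below are illustrative only (in BoTorch the acquisition-level constraints actually act on posterior samples of the model outputs, not on x directly); the point is that the same feasible region needs a negation between the two conventions.

```python
import numpy as np

def g_optimizer(x):
    # Optimizer-level convention (nonlinear_inequality_constraints):
    # feasible iff g(x) >= 0. Region: x[0] + x[1] <= 1.
    return 1.0 - (x[0] + x[1])

def g_acqf(x):
    # Acquisition-level convention (constraints kwarg):
    # feasible iff the value is <= 0, so negate the same function.
    return -g_optimizer(x)
```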

Once again, thanks a lot for taking time out to help!

