Hierarchical search spaces (ax issue, 12 comments, CLOSED)

Comments (12)

lena-kashtelyan commented on April 16, 2024

@yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez: BayesOpt mode is now supported in alpha mode, and it currently works through search-space flattening (so the Gaussian Process model is not aware of the hierarchical structure of the search space under the hood). cc @dme65 to say more about when BayesOpt over flattened search spaces is effective.

If you try it, please let us know how it goes for you (ideally in this issue)! Here is an updated version of my example above that should let you run BayesOpt:

from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.service.utils.report_utils import exp_to_df
from ax.utils.measurement.synthetic_functions import branin

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "model",
            "type": "choice",
            "values": ["Linear", "XGBoost"],
            "dependents": {
                "Linear": ["learning_rate", "l2_reg_weight"],
                "XGBoost": ["num_boost_rounds"],
            },
        },
        {
            "name": "learning_rate",
            "type": "range",
            "bounds": [0.001, 0.1],
            "log_scale": True,
        },
        {
            "name": "l2_reg_weight",
            "type": "range",
            "bounds": [0.00001, 0.001],
        },
        {
            "name": "num_boost_rounds",
            "type": "range",
            "bounds": [0, 15],
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # To force "Sobol" if BayesOpt does not work well (please post a repro in
    # a GitHub issue to let us know; it will be a great help in debugging this faster!)
    # choose_generation_strategy_kwargs={"no_bayesian_optimization": True},
)

riyadparvez commented on April 16, 2024

Yes, this would be a great addition! I have a similar use case: after hyperparameter optimization, choosing the right threshold for classification.

lena-kashtelyan commented on April 16, 2024

This is in progress now, and it is already possible to run experiments over hierarchical search spaces via the Developer and Service APIs (the functionality is currently on the main branch and will be included in the next stable release). There are some constraints: a hierarchical search space must be a valid connected tree with one root parameter, and currently only Sobol is supported while we develop proper modeling support for this. It is nonetheless possible to run a basic optimization over a hierarchical search space like so:

from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.service.utils.report_utils import exp_to_df
from ax.utils.measurement.synthetic_functions import branin

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "model",
            "type": "choice",
            "values": ["Linear", "XGBoost"],
            "dependents": {
                "Linear": ["learning_rate", "l2_reg_weight"],
                "XGBoost": ["num_boost_rounds"],
            },
        },
        {
            "name": "learning_rate",
            "type": "range",
            "bounds": [0.001, 0.1],
            "log_scale": True,
        },
        {
            "name": "l2_reg_weight",
            "type": "range",
            "bounds": [0.00001, 0.001],
        },
        {
            "name": "num_boost_rounds",
            "type": "range",
            "bounds": [0, 15],
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # To force "Sobol" if BayesOpt does not work well (please post a repro in
    # a GitHub issue to let us know; it will be a great help in debugging this faster!)
    # choose_generation_strategy_kwargs={"no_bayesian_optimization": True},
)


def contrived_branin(parameterization):  # branin domain: x1 in [-5., 10.], x2 in [0., 15.]
    if parameterization.get("model") == "Linear":
        lr = parameterization.get("learning_rate")
        l2_reg = parameterization.get("l2_reg_weight")
        
        print(f"Computing Branin with x1={lr * 100}, x2={l2_reg * 1000} (`Linear` model case)")
        return branin(lr * 100, l2_reg * 1000)
    
    if parameterization.get("model") == "XGBoost":
        num_boost_rounds = parameterization.get("num_boost_rounds")
        
        print(f"Computing Branin with x1={num_boost_rounds-5}, x2={num_boost_rounds} (`XGBoost` model case)")
        return branin(num_boost_rounds-5, num_boost_rounds)
    
    raise NotImplementedError

for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    
    ax_client.complete_trial(
        trial_index=trial_index, 
        raw_data=contrived_branin(params)
    )

exp_to_df(ax_client.experiment)

We don't have a specific estimate of when our BayesOpt algorithms will support this functionality, but it should be within a few months.

cc @yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez

LyzhinIvan commented on April 16, 2024

Great, thanks! I'm looking forward to BayesOpt support.

lena-kashtelyan commented on April 16, 2024

I'll close this issue as inactive, but the example above is functional for anyone who wants to try it!

ldworkin commented on April 16, 2024

Hi @yonatanMedan! Great question. We don't currently support this, but it's on our roadmap to support in the next few months. I'll let you know when it's ready!

Tandon-A commented on April 16, 2024

This enhancement would be super helpful in my use case, where I want to experiment with different learning rate schedulers, each of which takes a different set of parameters.
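
For concreteness, such a search space could be declared with the same Service API pattern as the example above; the scheduler names, parameter names, and bounds below are made up purely for illustration:

from ax.service.ax_client import AxClient, ObjectiveProperties

# Hypothetical sketch: the scheduler choice is the root parameter, and each
# scheduler declares its own dependent parameters (names/bounds are invented).
ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "lr_scheduler",
            "type": "choice",
            "values": ["StepLR", "CosineAnnealingLR"],
            "dependents": {
                "StepLR": ["step_size", "gamma"],
                "CosineAnnealingLR": ["T_max"],
            },
        },
        {"name": "step_size", "type": "range", "bounds": [1, 30]},
        {"name": "gamma", "type": "range", "bounds": [0.1, 0.9]},
        {"name": "T_max", "type": "range", "bounds": [10, 100]},
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
)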

LyzhinIvan commented on April 16, 2024

Hi! Is there an estimate of when this functionality will be available?

ldworkin commented on April 16, 2024

Hi @LyzhinIvan! Unfortunately, probably not in the immediate short term. This has been deprioritized in favor of other efforts. However, it's certainly still on our roadmap! cc @2timesjay

lena-kashtelyan commented on April 16, 2024

We will now be tracking wishlist items / feature requests in a master issue for improved visibility: #566. Of course, please feel free to still open new feature request issues; we'll take care of thinking them through and adding them to the master issue.

lena-kashtelyan commented on April 16, 2024

Reopening this issue as it is now in progress.

sgbaird commented on April 16, 2024

Another use case for hierarchical search spaces is in the physical sciences, with multi-step processing / synthesis. For example, there are 4 pieces of equipment, A, B, C, and D, where A always comes first and D always comes last, and in the middle you can choose either B or C, where B and C have distinct equipment parameters. Another case is where you can omit the second processing step altogether.

B or C might look like a surface preparation step that involves two different types of surface preparation: plasma etching vs. chemical etching. I think often (at least in academia), multi-step/multi-path synthesis routes are reduced to single-path optimizations that operate largely independently from one another despite sharing common traits. I think featurization of complex synthesis routes is still an ongoing question. The examples that treat complex synthesis routes generally fall into the category of natural language processing (e.g. ULSA, Roborxn).
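
For concreteness, a sketch of that middle-step choice in the same Service API format as the examples above (the step names, parameter names, and bounds are all hypothetical):

from ax.service.ax_client import AxClient, ObjectiveProperties

# Hypothetical sketch: equipment A and D are fixed steps, so only the middle
# step (B vs. C, e.g. plasma vs. chemical etching) branches. Each branch
# exposes its own made-up parameters.
ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "middle_step",
            "type": "choice",
            "values": ["B_plasma_etch", "C_chemical_etch"],
            "dependents": {
                "B_plasma_etch": ["plasma_power"],
                "C_chemical_etch": ["etch_time"],
            },
        },
        {"name": "plasma_power", "type": "range", "bounds": [50.0, 500.0]},
        {"name": "etch_time", "type": "range", "bounds": [1.0, 60.0]},
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
)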

How does the flattening of the search space in the above example work? One option would be to add boolean variables that describe whether a given branch is active and/or to set the inactive parameters to particular values.
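
Purely as an illustration of that second idea (and not necessarily what Ax does internally), a flat encoding of the model example above might look like this:

# Hypothetical illustration of a flattened encoding; not necessarily how Ax
# implements flattening internally.

# A point with the "Linear" branch active, as in the example above:
hierarchical = {"model": "Linear", "learning_rate": 0.01, "l2_reg_weight": 0.0005}

# One possible flat encoding: every parameter is always present, the inactive
# branch is filled with a fixed dummy value, and boolean flags mark which
# branch is active.
flattened = {
    "model": "Linear",
    "learning_rate": 0.01,
    "l2_reg_weight": 0.0005,
    "num_boost_rounds": 0,   # inactive XGBoost branch, filled with a dummy value
    "linear_active": True,   # optional indicator variables
    "xgboost_active": False,
}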

"so the Gaussian Process model is not aware of the hierarchical structure of the search space under the hood"

It sounds like adding variables that encode the hierarchy isn't the approach that's taken here; however, the inactive variables would still need to be assigned values, correct? How is this handled currently?

I wonder if there's some possibility of representing hierarchical search spaces as flattened search spaces with non-linear constraints. I'd need to give that one some more thought.

(not time-sensitive for me atm)
